Similar Articles
20 similar articles found (search time: 31 ms)
1.
Nanomaterials are finding application in many different environmentally relevant products and processes due to enhanced catalytic, antimicrobial, and oxidative properties of materials at this scale. As the market share of nano‐functionalized products increases, so too does the potential for environmental exposure and contamination. This study presents some exposure ranking methods that consider potential metallic nanomaterial surface water exposure and fate, due to nano‐functionalized products, through a number of exposure pathways. These methods take into account the limited and disparate data currently available for metallic nanomaterials and apply variability and uncertainty principles, together with qualitative risk assessment principles, to develop a scientific ranking. Three exposure scenarios with three different nanomaterials were considered to demonstrate these assessment methods: photo‐catalytic exterior paint (nano‐scale TiO2), antimicrobial food packaging (nano‐scale Ag), and particulate‐reducing diesel fuel additives (nano‐scale CeO2). Data and hypotheses from literature relating to metallic nanomaterial aquatic behavior (including the behavior of materials that may relate to nanomaterials in aquatic environments, e.g., metals, pesticides, surfactants) were used together with commercial nanomaterial characteristics and Irish natural aquatic environment characteristics to rank the potential concentrations, transport, and persistence behaviors within subjective categories. These methods, and the applied scenarios, reveal where data critical to estimating exposure and risk are lacking. As research into the behavior of metallic nanomaterials in different environments emerges, the influence of material and environmental characteristics on nanomaterial behavior within these exposure‐ and risk‐ranking methods may be redefined on a quantitative basis.

2.
Twenty-four-hour recall data from the Continuing Survey of Food Intake by Individuals (CSFII) are frequently used to estimate dietary exposure for risk assessment. Food frequency questionnaires are traditional instruments of epidemiological research; however, their application in dietary exposure and risk assessment has been limited. This article presents a probabilistic method of bridging the National Health and Nutrition Examination Survey (NHANES) food frequency and the CSFII data to estimate longitudinal (usual) intake, using a case study of seafood mercury exposures for two population subgroups (females 16 to 49 years and children 1 to 5 years). Two hundred forty-nine CSFII food codes were mapped into 28 NHANES fish/shellfish categories. FDA and state/local seafood mercury data were used. A uniform distribution with minimum and maximum blood-diet ratios of 0.66 to 1.07 was assumed. A probabilistic assessment was conducted to estimate distributions of individual 30-day average daily fish/shellfish intakes, methyl mercury exposure, and blood levels. The upper percentile estimates of fish and shellfish intakes based on the 30-day daily averages were lower than those based on two- and three-day daily averages. These results support previous findings that distributions of "usual" intakes based on a small number of consumption days provide overestimates in the upper percentiles. About 10% of the females (16 to 49 years) and children (1 to 5 years) may be exposed to mercury levels above the EPA's RfD. The predicted 75th and 90th percentile blood mercury levels for the females in the 16-to-49-year group were similar to those reported by NHANES. The predicted 90th percentile blood mercury level for children in the 1-to-5-year subgroup was similar to the NHANES value, and the 75th percentile estimate was slightly above it.
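The longitudinal-intake idea above can be sketched as a small Monte Carlo simulation. Every input below (fish-eating frequency, serving-size and mercury-concentration distributions, body weight) is an illustrative assumption, not the study's NHANES/CSFII or FDA data:

```python
import random

random.seed(1)

# Illustrative assumptions only -- not the study's survey or mercury data.
N_DAYS = 30               # averaging window for longitudinal ("usual") intake
N_PERSONS = 10_000
BODY_WEIGHT_KG = 65.0
RFD_UG_PER_KG_DAY = 0.1   # EPA reference dose for methyl mercury

def average_daily_dose():
    """One person's 30-day average methyl mercury dose (ug/kg-day)."""
    total_ug = 0.0
    for _ in range(N_DAYS):
        if random.random() < 0.15:                      # a fish-eating day
            serving_g = random.lognormvariate(4.0, 0.6)     # serving size, g
            hg_ug_per_g = random.lognormvariate(-2.3, 0.8)  # fish Hg, ug/g
            total_ug += serving_g * hg_ug_per_g
    return total_ug / N_DAYS / BODY_WEIGHT_KG

doses = sorted(average_daily_dose() for _ in range(N_PERSONS))
frac_over_rfd = sum(d > RFD_UG_PER_KG_DAY for d in doses) / N_PERSONS
p90 = doses[int(0.9 * N_PERSONS)]
print(f"90th percentile dose: {p90:.3f} ug/kg-day; "
      f"fraction above RfD: {frac_over_rfd:.1%}")
```

Because each simulated dose is a 30-day average, the upper percentiles are pulled toward the mean relative to one- or two-day averages, which is the shrinkage effect the abstract describes.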

3.
Armand Maul, Risk Analysis, 2014, 34(9): 1606–1617
Microbial risk assessment is dependent on several biological and environmental factors that affect both the exposure characteristics to the biological agents and the mechanisms of pathogenicity involved in the pathogen‐host relationship. Many exposure assessment studies still focus on the location parameters of the probability distribution representing the concentration of the pathogens and/or toxin. However, the mean or median by themselves are insufficient to evaluate the adverse effects that are associated with a given level of exposure. Therefore, the effects on the risk of disease of a number of factors, including the shape parameters characterizing the distribution patterns of the pathogen in their environment, were investigated. The statistical models, which were developed to provide a better understanding of the factors influencing the risk, highlight the role of heterogeneity and its consequences on the commonly used risk assessment paradigm. Indeed, the heterogeneity characterizing the spatial and temporal distribution of the pathogen and/or the toxin contained in the water or food consumed is shown to be a major factor that may influence the magnitude of the risk dramatically. In general, the risk diminishes with higher levels of heterogeneity. This scheme is totally inverted in the presence of a threshold in the dose‐response relationship, since heterogeneity will then have a tremendous impact, namely, by magnifying the risk when the mean concentration of pathogens is below the threshold. Moreover, the approach of this article may be useful for risk ranking analysis, regarding different exposure conditions, and may also lead to improved water and food quality guidelines.
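Both effects described above can be reproduced in a few lines. The sketch below assumes a single-hit exponential dose-response, a hypothetical threshold variant, and a lognormal dose distribution whose arithmetic mean is held fixed while its spread (heterogeneity) varies; none of these choices are taken from the article itself:

```python
import math
import random

random.seed(0)

def mean_risk(mean_dose, sigma, response, n=100_000):
    """Average illness probability when dose is lognormal with arithmetic
    mean `mean_dose` and log-sd `sigma` (location shifted to fix the mean)."""
    mu = math.log(mean_dose) - sigma ** 2 / 2     # keeps E[dose] constant
    return sum(response(random.lognormvariate(mu, sigma))
               for _ in range(n)) / n

exp_model = lambda d: 1 - math.exp(-0.1 * d)      # single-hit, no threshold
threshold = lambda d: 1 - math.exp(-0.1 * (d - 50)) if d > 50 else 0.0

low_het = mean_risk(10, 0.25, exp_model)
high_het = mean_risk(10, 2.0, exp_model)
# Concave dose-response: spreading doses out lowers the average risk.
assert high_het < low_het

low_t = mean_risk(10, 0.25, threshold)
high_t = mean_risk(10, 2.0, threshold)
# Mean dose (10) below the threshold (50): only heterogeneity pushes some
# doses past it, so the pattern inverts and risk rises with heterogeneity.
assert high_t > low_t
print(low_het, high_het, low_t, high_t)
```

The inversion is just Jensen's inequality at work: the exponential model is concave, so mean-preserving spread reduces expected risk, while a threshold makes the response locally convex around the mean.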

4.
Climate change may impact waterborne and foodborne infectious disease, but to what extent is uncertain. Estimating climate‐change‐associated relative infection risks from exposure to viruses, bacteria, or parasites in water or food is critical for guiding adaptation measures. We present a computational tool for strategic decision making that describes the behavior of pathogens using location‐specific input data under current and projected climate conditions. Pathogen‐pathway combinations are available for exposure to norovirus, Campylobacter, Cryptosporidium, and noncholera Vibrio species via drinking water, bathing water, oysters, or chicken fillets. Infection risk outcomes generated by the tool under current climate conditions correspond with those published in the literature. The tool demonstrates that increasing temperatures lead to increasing risks for infection with Campylobacter from consuming raw/undercooked chicken fillet and for Vibrio from water exposure. Increasing frequencies of drought generally lead to an elevated infection risk of exposure to persistent pathogens such as norovirus and Cryptosporidium, but decreasing risk of exposure to rapidly inactivating pathogens, like Campylobacter. The opposite is the case with increasing annual precipitation; an upsurge of heavy rainfall events leads to more peaks in infection risks in all cases. The interdisciplinary tool presented here can be used to guide climate change adaptation strategies focused on infectious diseases.

5.
Melamine contamination of food has become a major food safety issue because of incidents of infant disease caused by exposure to this chemical. This study was aimed at establishing a safety limit in Taiwan for the degree of melamine migration from food containers. Health risk assessment was performed for three exposure groups (preschool children, individuals who dine out, and elderly residents of nursing homes). Selected values of tolerable daily intake (TDI) for melamine were used to calculate the reference migration concentration limit (RMCL) or reference specific migration limit (RSML) for melamine food containers. The only existing values of these limits for international standards today are 1.2 mg/L (0.2 mg/dm2) in China and 30 mg/L (5 mg/dm2) in the European Union. The factors used in the calculations included the specific surface area of food containers, daily food consumption rate, body weight, TDI, and the percentile of the population protected at a given migration concentration limit (MCL). The results indicate that children are indeed at higher risk of melamine exposure at toxic levels than are other groups and that the 95th percentile of MCL (specific surface area = 5) for children aged 1–6 years should be the RMCL (0.07 mg/dm2) for protecting the sensitive and general population.
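The migration-limit arithmetic implied by the listed factors reduces to a one-line ratio: the tolerable daily dose for the body weight of interest, divided by the food-contact area encountered per day. The function below is a sketch of that relationship only; the numeric inputs are hypothetical and are not the study's Taiwanese exposure data:

```python
def migration_limit(tdi_mg_per_kg_day, body_weight_kg,
                    food_intake_kg_per_day, surface_area_dm2_per_kg):
    """Reference migration limit (mg/dm2): the release per unit container
    area at which the daily melamine dose just reaches the TDI."""
    tolerable_daily_mg = tdi_mg_per_kg_day * body_weight_kg
    contact_area_dm2 = food_intake_kg_per_day * surface_area_dm2_per_kg
    return tolerable_daily_mg / contact_area_dm2

# Illustrative child scenario (hypothetical values throughout):
limit = migration_limit(tdi_mg_per_kg_day=0.063,
                        body_weight_kg=15.0,
                        food_intake_kg_per_day=1.2,
                        surface_area_dm2_per_kg=5.0)
print(f"{limit:.3f} mg/dm2")
```

A probabilistic version, as in the study, would replace the fixed body weight and intake rate with population distributions and take a lower percentile of the resulting limit to protect sensitive groups.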

6.
An ecological risk assessment (ERA) was conducted as part of the Baseline Risk Assessment of the Remedial Investigation (RI) for the Baxter Springs/Treece subsites, Cherokee County, Kansas Superfund site, a former metals mining site. Chemicals of potential concern were heavy metals associated with mine wastes and with base metal ore deposits that were characteristic of this area. An EPA-approved method was used to develop site-specific ambient water quality criteria. Ecological impacts were assessed using three complementary approaches. First, potential chronic impacts were assessed by applying the toxicity quotient approach (i.e., a comparison of the measured concentration of site-related metals in surface water with calculated site-specific health-based criteria). Second, semi-quantitative comparative ecology data were used to provide a direct measure of impacts to key species. Finally, data on other factors (e.g., acclimation and tolerance evolution) that may affect the bioavailability and toxicity of site-related metals were also considered. Information from these three sources was used to obtain a realistic picture of actual and potential population- and community-level effects associated with exposure to mining-related metals.

7.
Modeling Microbial Growth Within Food Safety Risk Assessments
Risk estimates for food-borne infection will usually depend heavily on numbers of microorganisms present on the food at the time of consumption. As these data are seldom available directly, attention has turned to predictive microbiology as a means of inferring exposure at consumption. Codex guidelines recommend that microbiological risk assessment should explicitly consider the dynamics of microbiological growth, survival, and death in foods. This article describes predictive models and resources for modeling microbial growth in foods, and their utility and limitations in food safety risk assessment. We also aim to identify tools, data, and knowledge sources, and to provide an understanding of the microbial ecology of foods so that users can recognize model limits, avoid modeling unrealistic scenarios, and thus be able to appreciate the levels of confidence they can have in the outputs of predictive microbiology models. The microbial ecology of foods is complex. Developing reliable risk assessments involving microbial growth in foods will require the skills of both microbial ecologists and mathematical modelers. Simplifying assumptions will need to be made, but because of the potential for apparently small errors in growth rate to translate into very large errors in the estimate of risk, the validity of those assumptions should be carefully assessed. Quantitative estimates of absolute microbial risk within narrow confidence intervals do not yet appear to be possible. Nevertheless, the expression of microbial ecology knowledge in "predictive microbiology" models does allow decision support using the tools of risk assessment.

8.
The selection and use of chemicals and materials with less hazardous profiles reflects a paradigm shift from reliance on risk minimization through exposure controls to hazard avoidance. This article introduces risk assessment and alternatives assessment frameworks in order to clarify a misconception that alternatives assessment is a less effective tool to guide decision making, discusses factors promoting the use of each framework, and also identifies how and when application of each framework is most effective. As part of an assessor's decision process to select one framework over the other, it is critical to recognize that each framework is intended to perform different functions. Although the two frameworks share a number of similarities (such as identifying hazards and assessing exposure), an alternatives assessment provides a more realistic framework with which to select environmentally preferable chemicals because of its primary reliance on assessing hazards and secondary reliance on exposure assessment. Relevant to other life cycle impacts, the hazard of a chemical is inherent, and although it may be possible to minimize exposure (and subsequently reduce risk), it is challenging to assess such exposures through a chemical's life cycle. Through increased use of alternatives assessments at the initial stage of material or product design, there will be less reliance on post facto risk‐based assessment techniques because the potential for harm is significantly reduced, if not avoided, negating the need for assessing risk in the first place.

9.
Today there are more than 80,000 chemicals in commerce and the environment. The potential human health risks are unknown for the vast majority of these chemicals as they lack human health risk assessments, toxicity reference values, and risk screening values. We aim to use computational toxicology and quantitative high‐throughput screening (qHTS) technologies to fill these data gaps, and begin to prioritize these chemicals for additional assessment. In this pilot, we demonstrate how we were able to identify that benzo[k]fluoranthene may induce DNA damage and steatosis using qHTS data and two separate adverse outcome pathways (AOPs). We also demonstrate how bootstrap natural spline‐based meta‐regression can be used to integrate data across multiple assay replicates to generate a concentration–response curve. We used this analysis to calculate an in vitro point of departure of 0.751 μM and risk‐specific in vitro concentrations of 0.29 μM and 0.28 μM for 1:1,000 and 1:10,000 risk, respectively, for DNA damage. Based on the available evidence, and considering that only a single HSD17B4 assay is available, we have low overall confidence in the steatosis hazard identification. This case study suggests that coupling qHTS assays with AOPs and ontologies will facilitate hazard identification. Combining this with quantitative evidence integration methods, such as bootstrap meta‐regression, may allow risk assessors to identify points of departure and risk‐specific internal/in vitro concentrations. These results are sufficient to prioritize the chemicals; however, in the longer term we will need to estimate external doses for risk screening purposes, such as through margin of exposure methods.

10.
Siming You, Man Pun Wan, Risk Analysis, 2015, 35(8): 1488–1502
A new risk assessment scheme was developed to quantify the impact of resuspension on indoor infection transmission. Airborne and surface pathogenic particle concentration models including the effect of two major resuspension scenarios (airflow‐induced particle resuspension [AIPR] and walking‐induced particle resuspension [WIPR]) were derived based on two‐compartment mass balance models and validated against experimental data found in the literature. The inhalation exposure to pathogenic particles was estimated using the derived airborne concentration model, and subsequently incorporated into a dose‐response model to assess the infection risk. Using the proposed risk assessment scheme, the influence of resuspension on indoor infection transmission was examined by two hypothetical case studies. In the case of AIPR, the infection risk increased from 0 to 0.54 during 0–0.5 hours and from 0.54 to 0.57 during 0.5–4 hours. In the case of WIPR, the infection risk increased from 0 to 0.87 during 0–0.5 hours and from 0.87 to 1 during 0.5–4 hours. Sensitivity analysis was conducted based on the design‐of‐experiments method and showed that the factors that are related to the inspiratory rate of viable pathogens and pathogen virulence have the most significant effect on the infection probability under the occurrence of AIPR and WIPR. The risk assessment scheme could serve as an effective tool for the risk assessment of infection transmission indoors.
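The modeling chain described (mass-balance concentration model, inhaled dose, dose-response) can be illustrated with a deliberately simplified one-compartment version; the study itself uses two-compartment models, and all parameter values below (resuspension source strength, room volume, loss rate, breathing rate, dose-response parameter) are hypothetical:

```python
import math

def airborne_concentration(t_h, source_per_h, vol_m3, loss_per_h, c0=0.0):
    """Analytic solution of dC/dt = source/vol - loss*C (particles/m^3):
    a constant resuspension source balanced by first-order losses."""
    steady = source_per_h / (vol_m3 * loss_per_h)
    return steady + (c0 - steady) * math.exp(-loss_per_h * t_h)

def infection_risk(dose, r=0.01):
    """Exponential dose-response model: P = 1 - exp(-r * dose)."""
    return 1 - math.exp(-r * dose)

# Inhaled dose over 4 h at a 0.5 m^3/h breathing rate (midpoint rule)
breathing_m3_h = 0.5
dt, dose = 0.01, 0.0
for i in range(400):
    c_mid = airborne_concentration((i + 0.5) * dt, source_per_h=1e4,
                                   vol_m3=50.0, loss_per_h=2.0)
    dose += c_mid * breathing_m3_h * dt
print(f"4-h inhaled dose: {dose:.0f} particles, "
      f"risk: {infection_risk(dose):.2f}")
```

The same shape appears in the case-study numbers quoted above: risk climbs steeply while concentration builds toward steady state, then flattens as the dose accumulates more slowly.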

11.
Listeria monocytogenes is a leading cause of hospitalization, fetal loss, and death due to foodborne illnesses in the United States. A quantitative assessment of the relative risk of listeriosis associated with the consumption of 23 selected categories of ready‐to‐eat foods, published by the U.S. Department of Health and Human Services and the U.S. Department of Agriculture in 2003, has been instrumental in identifying the food products and practices that pose the greatest listeriosis risk and has guided the evaluation of potential intervention strategies. Dose‐response models, which quantify the relationship between an exposure dose and the probability of adverse health outcomes, were essential components of the risk assessment. However, because of data gaps and limitations in the available data and modeling approaches, considerable uncertainty existed. Since publication of the risk assessment, new data have become available for modeling L. monocytogenes dose‐response. At the same time, recent advances in the understanding of L. monocytogenes pathophysiology and strain diversity have warranted a critical reevaluation of the published dose‐response models. To discuss strategies for modeling L. monocytogenes dose‐response, the Interagency Risk Assessment Consortium (IRAC) and the Joint Institute for Food Safety and Applied Nutrition (JIFSAN) held a scientific workshop in 2011 (details available at http://foodrisk.org/irac/events/). The main findings of the workshop and the most current and relevant data identified during the workshop are summarized and presented in the context of L. monocytogenes dose‐response. This article also discusses new insights on dose‐response modeling for L. monocytogenes and research opportunities to meet future needs.

12.
Estimation of benchmark doses (BMDs) in quantitative risk assessment traditionally is based upon parametric dose‐response modeling. It is a well‐known concern, however, that if the chosen parametric model is uncertain and/or misspecified, inaccurate and possibly unsafe low‐dose inferences can result. We describe a nonparametric approach for estimating BMDs with quantal‐response data based on an isotonic regression method, and also study use of corresponding, nonparametric, bootstrap‐based confidence limits for the BMD. We explore the confidence limits’ small‐sample properties via a simulation study, and illustrate the calculations with an example from cancer risk assessment. It is seen that this nonparametric approach can provide a useful alternative for BMD estimation when faced with the problem of parametric model uncertainty.
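The core of such a nonparametric approach is isotonic regression via the pool-adjacent-violators algorithm (PAVA), followed by reading a BMD off the monotone fit. The sketch below shows that core step only, on made-up quantal data, with a 10% extra-risk benchmark response; the article's bootstrap confidence limits are not reproduced here:

```python
def pava(props, weights):
    """Pool-adjacent-violators: weighted monotone nondecreasing fit."""
    vals, wts = list(props), list(weights)
    blocks = [[i] for i in range(len(vals))]
    i = 0
    while i < len(vals) - 1:
        if vals[i] > vals[i + 1]:            # violator: pool the two blocks
            w = wts[i] + wts[i + 1]
            v = (vals[i] * wts[i] + vals[i + 1] * wts[i + 1]) / w
            vals[i:i + 2] = [v]
            wts[i:i + 2] = [w]
            blocks[i:i + 2] = [blocks[i] + blocks[i + 1]]
            i = max(i - 1, 0)                # re-check backwards
        else:
            i += 1
    fit = [0.0] * len(props)
    for v, blk in zip(vals, blocks):
        for j in blk:
            fit[j] = v
    return fit

# Illustrative quantal data: dose, number responding, group size
doses = [0, 10, 50, 150, 400]
resp = [2, 4, 3, 12, 19]
n = [20, 20, 20, 20, 20]
fit = pava([r / m for r, m in zip(resp, n)], n)

# BMD at 10% extra risk over the isotonic background, linearly interpolated
p0 = fit[0]
bmr_level = p0 + 0.1 * (1 - p0)
bmd = None
for k in range(len(doses) - 1):
    if fit[k] <= bmr_level <= fit[k + 1] and fit[k + 1] > fit[k]:
        frac = (bmr_level - fit[k]) / (fit[k + 1] - fit[k])
        bmd = doses[k] + frac * (doses[k + 1] - doses[k])
        break
print(fit, bmd)
```

A bootstrap BMDL would then resample the binomial counts at each dose, repeat this fit, and take a lower percentile of the resulting BMD distribution.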

13.
Risk Analysis, 2018, 38(6): 1223–1238
Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide‐handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach.

14.
Microbial food safety risk assessment models can often be simplified by eliminating the need to integrate a complex dose‐response relationship across a distribution of exposure doses. This is possible if exposure pathways consistently lead to exposure doses that have a small probability of causing illness. In this situation, the probability of illness will follow an approximately linear function of dose. Consequently, the predicted probability of illness per serving across all exposures is linear with respect to the expected value of dose. The majority of dose‐response functions are approximately linear when the dose is low. Nevertheless, what constitutes “low” is dependent on the parameters of the dose‐response function for a particular pathogen. In this study, a method is proposed to determine an upper bound of the exposure distribution for which the use of a linear dose‐response function is acceptable. If this upper bound is substantially larger than the expected value of exposure doses, then a linear approximation for probability of illness is reasonable. If conditions are appropriate for using the linear dose‐response approximation, for example, the expected value for exposure doses is two to three log10 units smaller than the upper bound of the linear portion of the dose‐response function, then predicting the risk‐reducing effectiveness of a proposed policy is trivial. Simple examples illustrate how this approximation can be used to inform policy decisions and improve an analyst's understanding of risk.
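The idea of bounding the "linear" region can be sketched numerically with an exponential dose-response P(d) = 1 − exp(−r·d) and its low-dose approximation P(d) ≈ r·d. The parameter r and the 10% relative-error criterion below are assumptions for illustration, not values from the article:

```python
import math

r = 0.002   # hypothetical dose-response parameter

def p_exact(d):
    return 1 - math.exp(-r * d)

def p_linear(d):
    return r * d

# Largest dose at which the linear form overstates the exact risk by less
# than 10% relative error (the error grows monotonically with dose).
d, d_upper = 0.1, 0.1
while p_linear(d) < 1.0:
    if (p_linear(d) - p_exact(d)) / p_exact(d) < 0.10:
        d_upper = d
    d *= 1.05

mean_dose = 1.0   # hypothetical expected exposure dose
print(f"linear approximation acceptable up to ~{d_upper:.0f}; the mean "
      f"dose sits {math.log10(d_upper / mean_dose):.1f} log10 units below")
```

With these numbers the mean dose lands about two log10 units below the bound, matching the rule of thumb quoted in the abstract; for the exponential model the bound scales as a constant divided by r, so it is pathogen-specific.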

15.
Risk Analysis, 2018, 38(5): 1052–1069
This study investigated whether, in the absence of chronic noncancer toxicity data, short‐term noncancer toxicity data can be used to predict chronic toxicity effect levels by focusing on the dose–response relationship instead of a critical effect. Data from National Toxicology Program (NTP) technical reports have been extracted and modeled using the Environmental Protection Agency's Benchmark Dose Software. Best‐fit, minimum benchmark dose (BMD), and benchmark dose lower limits (BMDLs) have been modeled for all NTP pathologist identified significant nonneoplastic lesions, final mean body weight, and mean organ weight of 41 chemicals tested by NTP between 2000 and 2012. Models were then developed at the chemical level using orthogonal regression techniques to predict chronic (two years) noncancer health effect levels using the results of the short‐term (three months) toxicity data. The findings indicate that short‐term animal studies may reasonably provide a quantitative estimate of a chronic BMD or BMDL. This can allow for faster development of human health toxicity values for risk assessment for chemicals that lack chronic toxicity data.
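Orthogonal regression differs from ordinary least squares in allowing error in both variables, which is appropriate here because both the subchronic and the chronic BMDLs are estimated quantities. The sketch below implements the standard Deming-regression slope formula on made-up log10 BMDL pairs; the data and the error-variance ratio are illustrative assumptions, not the NTP results:

```python
import math

def deming_fit(x, y, lam=1.0):
    """Orthogonal (Deming) regression slope and intercept, allowing error
    in both variables; lam is the ratio of error variances (1 = orthogonal)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    slope = ((syy - lam * sxx
              + math.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2))
             / (2 * sxy))
    return slope, my - slope * mx

# Hypothetical log10 BMDLs: subchronic (x) vs. chronic (y), six chemicals
sub = [1.2, 0.4, 2.1, 1.7, 0.9, 2.8]
chronic = [0.8, -0.1, 1.6, 1.1, 0.5, 2.3]
slope, intercept = deming_fit(sub, chronic)

# Predicted chronic log10 BMDL for a new chemical with subchronic 1.5
predicted_chronic = slope * 1.5 + intercept
print(slope, intercept, predicted_chronic)
```

Working in log10 units means the fitted intercept acts like a subchronic-to-chronic adjustment factor, which is the quantity such a model effectively estimates.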

16.
Quantitative Risk Assessment for Developmental Neurotoxic Effects
Developmental neurotoxicity concerns the adverse health effects of exogenous agents acting on neurodevelopment. Because human brain development is a delicate process involving many cellular events, the developing fetus is rather susceptible to compounds that can alter the structure and function of the brain. Today, there is clear evidence that early exposure to many neurotoxicants can severely damage the developing nervous system. Although in recent years, there has been much attention given to model development and risk assessment procedures for developmental toxicants, the area of developmental neurotoxicity has been largely ignored. Here, we consider the problem of risk estimation for developmental neurotoxicants from animal bioassay data. Since most responses from developmental neurotoxicity experiments are nonquantal in nature, an adverse health effect will be defined as a response that occurs with very small probability in unexposed animals. Using a two-stage hierarchical normal dose-response model, upper confidence limits on the excess risk due to a given level of added exposure are derived. Equivalently, the model is used to obtain lower confidence limits on dose for a small negligible level of risk. Our method is based on the asymptotic distribution of the likelihood ratio statistic (cf. Crump, 1995). An example is used to provide further illustration.

17.
Risk‐benefit analyses are introduced as a new paradigm for old problems. However, in many cases it is not necessary to perform a full comprehensive and expensive quantitative risk‐benefit assessment to solve the problem, nor is it always possible, given the lack of required data. The choice to continue from a more qualitative to a full quantitative risk‐benefit assessment can be made using a tiered approach. In this article, this tiered approach for risk‐benefit assessment will be addressed using a decision tree. The tiered approach described uses the same four steps as the risk assessment paradigm: hazard and benefit identification, hazard and benefit characterization, exposure assessment, and risk‐benefit characterization, albeit in a different order. For the purpose of this approach, the exposure assessment has been moved upward and the dose‐response modeling (part of hazard and benefit characterization) is moved to a later stage. The decision tree includes several stopping points, depending on whether the gathered information is sufficient to answer the initial risk‐benefit question. The approach has been tested for two food ingredients. The decision tree presented in this article is useful to assist a risk‐benefit assessor and policymaker, on a case‐by‐case basis, in making informed choices about when to stop or continue with a risk‐benefit assessment.

18.
Monte Carlo simulations are commonplace in quantitative risk assessments (QRAs). Designed to propagate the variability and uncertainty associated with each individual exposure input parameter in a quantitative risk assessment, Monte Carlo methods statistically combine the individual parameter distributions to yield a single, overall distribution. Critical to such an assessment is the representativeness of each individual input distribution. The authors performed a literature review to collect and compare the distributions used in published QRAs for the parameters of body weight, food consumption, soil ingestion rates, breathing rates, and fluid intake. To provide a basis for comparison, all estimated exposure parameter distributions were evaluated with respect to four properties: consistency, accuracy, precision, and specificity. The results varied depending on the exposure parameter. Even where extensive, well-collected data exist, investigators used a variety of different distributional shapes to approximate these data. Where such data do not exist, investigators have collected their own data, often leading to substantial disparity in parameter estimates and subsequent choice of distribution. The present findings indicate that more attention must be paid to the data underlying these distributional choices. More emphasis should be placed on sensitivity analyses, quantifying the impact of assumptions, and on discussion of sources of variation as part of the presentation of any risk assessment results. If such practices and disclosures are followed, it is believed that Monte Carlo simulations can greatly enhance the accuracy and appropriateness of specific risk assessments. Without such disclosures, researchers will be increasing the size of the risk assessment "black box," a concern already raised by many critics of more traditional risk assessments.

19.
Methyl tert-butyl ether (MTBE) was added to gasoline in New Hampshire (NH) between 1995 and 2006 to comply with the oxygenate requirements of the 1990 Amendments to the Clean Air Act. Leaking tanks and spills released MTBE into groundwater, and as a result, MTBE has been detected in drinking water in NH. We conducted a comparative cancer risk assessment and a margin-of-safety (MOS) analysis for several constituents, including MTBE, detected in NH drinking water. Using standard risk assessment methods, we calculated cancer risks from exposure to 12 detected volatile organic compounds (VOCs), including MTBE, and to four naturally occurring compounds (i.e., arsenic, radium-226, radium-228, and radon-222) detected in NH public water supplies. We evaluated exposures to a hypothetical resident ingesting the water, dermally contacting the water while showering, and inhaling compounds volatilizing from water in the home. We then compared risk estimates for MTBE to those of the other 15 compounds. From our analysis, we concluded that the high-end cancer risk from exposure to MTBE in drinking water is lower than the risks from all the other VOCs evaluated and several thousand times lower than the risks from exposure to naturally occurring constituents, including arsenic, radium, and radon. We also conducted an MOS analysis in which we compared toxicological points of departure to the NH maximum contaminant level (MCL) of 13 µg/L. All of the MOSs were greater than or equal to 160,000, indicating a large margin of safety and demonstrating the health-protectiveness of the NH MCL for MTBE.
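The standard screening arithmetic behind both analyses is straightforward: a chronic daily dose from the water concentration, a linear low-dose cancer risk from that dose, and an MOS as the ratio of a toxicological point of departure to the exposure dose. The sketch below uses the NH MCL of 13 µg/L from the abstract, but the cancer slope factor and point of departure are hypothetical placeholders, not the study's MTBE toxicity values:

```python
def daily_dose(conc_ug_L, intake_L_day, bw_kg=70.0):
    """Chronic average daily ingestion dose, mg/kg-day, from drinking water."""
    return conc_ug_L * 1e-3 * intake_L_day / bw_kg

def cancer_risk(dose_mg_kg_day, slope_factor):
    """Linear low-dose extrapolation: lifetime risk = dose * CSF."""
    return dose_mg_kg_day * slope_factor

dose = daily_dose(conc_ug_L=13.0, intake_L_day=2.0)    # at the NH MCL
risk = cancer_risk(dose, slope_factor=1.8e-3)          # hypothetical CSF
pod_mg_kg_day = 60.0                                   # hypothetical POD
mos = pod_mg_kg_day / dose
print(f"dose {dose:.2e} mg/kg-day, risk {risk:.1e}, MOS {mos:.0f}")
```

With these illustrative inputs the MOS lands on the order of 10^5, the same scale as the ≥160,000 margins reported; a full assessment would add the dermal and inhalation pathways evaluated in the study.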

20.
Nanotechnology is a broad term that encompasses materials, structures, or processes that utilize engineered nanomaterials, which can be defined as materials intentionally designed to have one or more dimensions between 1 and 100 nm. Historically, risk characterization has been viewed as the final phase of a risk assessment process that integrates hazard identification, dose‐response assessment, and exposure assessment. The novelty and diversity of materials, structures, and tools that are covered by above‐defined “nanotechnology” raise substantial methodological issues and pose significant challenges for each of these phases of risk assessment. These issues and challenges culminate in the risk characterization phase of the risk assessment process, and this article discusses several of these key issues and approaches to developing risk characterization results and their implications for risk management decision making that are specific to nanotechnology.
