Similar Articles
20 similar articles found (search time: 483 ms)
1.
Multivariate dose-response models have recently been proposed for developmental toxicity data to simultaneously model malformation incidence (a binary outcome) and reductions in fetal weight (a continuous outcome). In this and other applications, the binary outcome often represents a dichotomization of another outcome or a composite of outcomes, which facilitates analysis. For example, in Segment II developmental toxicology studies, multiple malformation types (i.e., external, visceral, skeletal) are evaluated on each fetus; malformation status may also be ordinally measured (e.g., normal, signs of variation, full malformation). A model is proposed for fetal weight and multiple malformation variables measured on an ordinal scale, where the correlations between the outcomes and between the offspring within a litter are taken into account. Fully specifying the joint distribution of outcomes within a litter is avoided by specifying only the distribution of the multivariate outcome for each fetus and using generalized estimating equation methodology to account for correlations due to litter clustering. The correlations between the outcomes are required to characterize joint risk to the fetus, and are therefore a focus of inference. Dose-response models and their application to quantitative risk assessment are illustrated using data from a recent developmental toxicology experiment of ethylene oxide in mice.
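As a minimal numerical illustration of why the litter clustering above matters (simulated data only, not the ethylene oxide study; the beta-distributed litter effect and all parameter values are assumptions), the sketch below compares a naive binomial standard error with the kind of cluster-robust standard error that generalized estimating equation methodology provides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical litter-clustered malformation data: pups in a litter share a
# litter-specific response probability (a beta-binomial mechanism).
n_litters, litter_size = 200, 10
p_litter = rng.beta(1.0, 4.0, size=n_litters)   # litter-specific malformation prob.
y = rng.binomial(litter_size, p_litter)         # malformed pups per litter

p_hat = y.sum() / (n_litters * litter_size)     # pooled malformation rate

# Naive SE treats every pup as an independent Bernoulli trial.
se_naive = np.sqrt(p_hat * (1 - p_hat) / (n_litters * litter_size))

# Cluster-robust SE treats the litter as the sampling unit, as GEE does.
litter_props = y / litter_size
se_robust = np.sqrt(np.var(litter_props, ddof=1) / n_litters)

print(se_naive, se_robust)   # robust SE is larger under litter clustering
```

Ignoring the litter effect here would understate the uncertainty of the pooled malformation rate.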

2.
Toxicologists are often interested in assessing the joint effect of an exposure on multiple reproductive endpoints, including early loss, fetal death, and malformation. Exposures that occur prior to mating or extremely early in development can adversely affect the number of implantation sites or fetuses that form within each dam and may even prevent pregnancy. A simple approach for assessing overall adverse effects in such studies is to consider fetuses or implants that fail to develop due to exposure as missing data. The missing data can be imputed, and standard methods for the analysis of quantal response data can then be used for quantitative risk assessment or testing. In this article, a new bias-corrected imputation procedure is proposed and evaluated. The procedure is straightforward to implement in standard statistical packages and has excellent operating characteristics when used in combination with a marginal model fit with generalized estimating equations. The methods are applied to data from a reproductive toxicity study of Nitrofurazone conducted by the National Toxicology Program.

3.
Ethylene oxide is a gas produced in large quantities in the United States that is used primarily as a chemical intermediate in the production of ethylene glycol, propylene glycol, non-ionic surfactants, ethanolamines, glycol ethers, and other chemicals. It has been well established that ethylene oxide can induce cancer, genetic, reproductive and developmental, and acute health effects in animals. The U.S. Environmental Protection Agency is currently developing both a cancer potency factor and a reference concentration (RfC) for ethylene oxide. This study used the rich database on the reproductive and developmental effects of ethylene oxide to develop a probabilistic characterization of possible regulatory thresholds for ethylene oxide. This analysis was based on the standard regulatory approach for noncancer risk assessment, but involved several innovative elements, such as: (1) the use of advanced statistical methods to account for correlations in developmental outcomes among littermates and allow for simultaneous control of covariates (such as litter size); (2) the application of a probabilistic approach for characterizing the uncertainty in extrapolating the animal results to humans; and (3) the use of a quantitative approach to account for the variation in heterogeneity among the human population. This article presents several classes of results, including: (1) probabilistic characterizations of ED10s for two quantal reproductive outcomes (resorption and fetal death); (2) probabilistic characterizations of one developmental outcome, the dose expected to yield a 5% reduction in fetal (or pup) weight; (3) estimates of the RfCs that would result from using these values in the standard regulatory approach for noncancer risk assessment; and (4) a probabilistic characterization of the level of ethylene oxide exposure that would be expected to yield a 1/1,000 increase in the risk of reproductive or developmental outcomes in exposed human populations.

4.
Public health concerns over the occurrence of developmental abnormalities that can occur as a result of prenatal exposure to drugs, chemicals, and other environmental factors have led to a number of developmental toxicity studies and the use of the benchmark dose (BMD) for risk assessment. To characterize risk from multiple sources, more recent analytic methods involve a joint modeling approach, accounting for multiple dichotomous and continuous outcomes. For some continuous outcomes, evaluating all subjects may not be feasible, and only a subset may be evaluated due to limited resources. The subset can be selected according to a prespecified probability model, and the unobserved data can be viewed as intentionally missing in the sense that subset selection results in missingness that is experimentally planned. We describe a subset selection model that allows for sampling pups with malformations and healthy pups at different rates, and includes the well-known simple random sample (SRS) as a special case. We were interested in understanding how sampling rates that are selected beforehand influence the precision of the BMD. Using simulations we show how improvements over the SRS can be obtained by oversampling malformations, and how some sampling rates can yield precision that is substantially worse than the SRS. We also illustrate the potential for cost saving with oversampling. Simulations are based on a joint mixed-effects model and, to account for subset selection, use case weights to obtain valid dose-response estimates.

5.
The awareness of potential risks emerging from the use of chemicals in all parts of daily life has increased the need for risk assessments that are able to cover a high number of exposure situations and thereby ensure the safety of workers and consumers. In the European Union (EU), the practice of risk assessments for chemicals is laid down in a Technical Guidance Document; it is designed to consider environmental and human occupational and residential exposure. Almost 70 EU risk assessment reports (RARs) have been finalized for high-production-volume chemicals during the last decade. In the present study, we analyze the assessment of occupational and consumer exposure to trichloroethylene and phthalates presented in six EU RARs. Exposure scenarios in these six RARs were compared to scenarios used in applications of the scenario-based risk assessment approach to the same set of chemicals. We find that scenarios used in the selected EU RARs to represent typical exposure situations in occupational or private use of chemicals and products do not necessarily represent worst-case conditions. This can be due to the use of outdated information on technical equipment and conditions in workplaces or omission of pathways that can cause consumer exposure. Considering the need for exposure and risk assessments under the new chemicals legislation of the EU, we suggest that a transparent process of collecting data on exposure situations and of generating representative exposure scenarios be implemented to improve the accuracy of risk assessments. Also, the data sets used to assess human exposure should be harmonized, summarized in a transparent fashion, and made accessible to all risk assessors and the public.

6.
Probabilistic risk assessments are enjoying increasing popularity as a tool to characterize the health hazards associated with exposure to chemicals in the environment. Because probabilistic analyses provide much more information to the risk manager than standard "point" risk estimates, this approach has generally been heralded as one which could significantly improve the conduct of health risk assessments. The primary obstacles to replacing point estimates with probabilistic techniques include a general lack of familiarity with the approach and a lack of regulatory policy and guidance. This paper discusses some of the advantages and disadvantages of the point estimate vs. probabilistic approach. Three case studies are presented which contrast and compare the results of each. The first addresses the risks associated with household exposure to volatile chemicals in tapwater. The second evaluates airborne dioxin emissions which can enter the food-chain. The third illustrates how to derive health-based cleanup levels for dioxin in soil. It is shown that, based on the results of Monte Carlo analyses of probability density functions (PDFs), the point estimate approach required by most regulatory agencies will nearly always overpredict the risk for the 95th percentile person by a factor of up to 5. When the assessment requires consideration of 10 or more exposure variables, the point estimate approach will often predict risks representative of the 99.9th percentile person rather than the 50th or 95th percentile person. This paper recommends a number of data distributions for various exposure variables that we believe are now sufficiently well understood to be used with confidence in most exposure assessments. A list of exposure variables that may require additional research before adequate data distributions can be developed is also discussed.
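The compounding-conservatism effect described above can be sketched with a toy Monte Carlo comparison (the lognormal inputs and all parameter values are illustrative assumptions, not the paper's case studies): plugging each input's 95th percentile into a multiplicative risk equation yields a result far above the 95th percentile of the simulated output distribution.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Four hypothetical multiplicative exposure factors, each lognormal.
sigma = 0.5
factors = [rng.lognormal(mean=0.0, sigma=sigma, size=n) for _ in range(4)]
risk = np.prod(factors, axis=0)                 # Monte Carlo risk distribution

mc_p95 = np.quantile(risk, 0.95)                # 95th-percentile person (simulated)

# "Point estimate" approach: take the 95th percentile of EACH input.
z95 = 1.645                                     # standard normal 95th percentile
p95_input = np.exp(sigma * z95)                 # lognormal 95th percentile
point_estimate = p95_input ** 4

print(point_estimate / mc_p95)   # point estimate overstates the 95th-percentile risk
```

With four inputs the overstatement is already around a factor of five, consistent with the pattern the paper reports.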

7.
The purpose of this article is to quantify the public health risk associated with inhalation of indoor airborne infection based on a probabilistic transmission dynamic modeling approach. We used the Wells-Riley mathematical model to estimate (1) the CO2 exposure concentrations in indoor environments where cases of inhalation airborne infection occurred based on reported epidemiological data and epidemic curves for influenza and severe acute respiratory syndrome (SARS), (2) the basic reproductive number, R0 (i.e., the expected number of secondary cases on the introduction of a single infected individual in a completely susceptible population) and its variability in a shared indoor airspace, and (3) the risk for infection in various scenarios of exposure in a susceptible population for a range of R0. We also employed a standard susceptible-infectious-recovered (SIR) structure to relate the Wells-Riley-derived R0 to a transmission parameter, linking indoor carbon dioxide concentration and contact rate. We estimate that a single case of SARS will infect 2.6 secondary cases on average in a population from nosocomial transmission, whereas less than 1 secondary infection was generated per case among school children. We also obtained an estimate of the basic reproductive number for influenza in a commercial airliner: the median value is 10.4. We suggest that improving the building air cleaning rate to lower the critical rebreathed fraction of indoor air can decrease the transmission rate. Here, we show that organism virulence factors, infectious quantum generation rates (quanta/s per infected person), and host factors determine the risk for inhalation of indoor airborne infection.
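The Wells-Riley model referenced above gives the infection probability as P = 1 - exp(-Iqpt/Q), where I is the number of infectors, q the quanta generation rate, p the breathing rate of a susceptible, t the exposure time, and Q the clean-air supply rate. A minimal sketch, with every parameter value assumed for illustration (none are taken from the article's SARS or influenza data):

```python
import math

# Wells-Riley infection probability: P = 1 - exp(-I*q*p*t/Q).
# All values below are illustrative assumptions.
I = 1          # infectors present in the space
q = 100.0      # quanta generation rate (quanta/h), assumed
p = 0.6        # breathing rate of a susceptible (m^3/h)
t = 4.0        # exposure duration (h)
Q = 400.0      # clean-air supply to the space (m^3/h)

P = 1.0 - math.exp(-I * q * p * t / Q)   # per-susceptible infection probability

S = 30         # susceptibles sharing the airspace
R0 = S * P     # expected secondary cases from the single infector

print(round(P, 3), round(R0, 2))
```

Raising Q (more air cleaning/ventilation) shrinks the exponent and hence P, which is the mechanism behind the rebreathed-fraction recommendation in the abstract.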

8.
A challenge with multiple chemical risk assessment is the need to consider the joint behavior of chemicals in mixtures. To address this need, pharmacologists and toxicologists have developed methods over the years to evaluate and test chemical interaction. In practice, however, testing of chemical interaction more often comprises ad hoc binary combinations and rarely examines higher order combinations. One explanation for this practice is the belief that there are simply too many possible combinations of chemicals to consider. Indeed, under stochastic conditions the possible number of chemical combinations scales geometrically as the pool of chemicals increases. However, the occurrence of chemicals in the environment is determined by factors, economic in part, which favor some chemicals over others. We investigate methods from the field of biogeography, originally developed to study avian species co-occurrence patterns, and adapt these approaches to examine chemical co-occurrence. These methods were applied to a national survey of pesticide residues in 168 child care centers from across the country. Our findings show that pesticide co-occurrence in child care centers was not random but highly structured, leading to the co-occurrence of specific pesticide combinations. Thus, ecological studies of species co-occurrence parallel the issue of chemical co-occurrence at specific locations. Both are driven by processes that introduce structure in the pattern of co-occurrence. We conclude that the biogeographical tools used to determine when this structure occurs in ecological studies are relevant to evaluations of pesticide mixtures for exposure and risk assessment.

9.
James Chen, Risk Analysis, 1993, 13(5): 559-564
A dose-response model is often fit to bioassay data to provide a mathematical relationship between the incidence of a developmental malformation and dose of a toxicant. To utilize the interrelations among the fetal weight, incidence of malformation, and number of live fetuses, a conditional Gaussian regression chain model is proposed to model the dose-response function for developmental malformation incidence using the litter size and/or the fetal weight as covariates. The litter size is modeled as a function of dose, the fetal weight is modeled as a function of dose conditional on the litter size, and the malformation incidence is modeled as a function of dose conditional on both the litter size and the fetal weight, which itself is also conditional on the litter size. Data from a developmental experiment conducted at the National Center for Toxicological Research to investigate the growth stunting and increased incidence of cleft palate induced by Dexamethasone (DEX) exposure in rats were used as an illustration.
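The regression chain above can be sketched roughly as three conditional fits (simulated data with invented coefficients, not the DEX experiment; a linear-probability fit stands in for the binary malformation stage, which the article would model properly):

```python
import numpy as np

rng = np.random.default_rng(2)
m = 1000   # simulated litters

# Hypothetical chain: dose -> litter size -> fetal weight -> malformation.
dose = rng.choice([0.0, 0.5, 1.0, 2.0], size=m)
size = 12.0 - 1.5 * dose + rng.normal(0, 1, m)                    # size | dose
weight = 6.0 - 0.8 * dose - 0.05 * size + rng.normal(0, 0.3, m)   # weight | dose, size
lin = -1.0 + 1.2 * dose - 0.5 * weight + 0.05 * size
mal = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))                 # mal | dose, size, weight

def ols(cols, y):
    """Least-squares fit with intercept; returns coefficients."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Fit each link of the chain conditional on its predecessors.
b_size = ols([dose], size)                 # litter size ~ dose
b_weight = ols([dose, size], weight)       # weight ~ dose | litter size
b_mal = ols([dose, size, weight], mal)     # linear-probability stand-in

print(b_size[1], b_weight[1], b_mal[1])    # dose effects at each stage
```

The chain factorization lets each dose effect be interpreted conditionally on the earlier outcomes, which is the point of the conditional Gaussian regression chain.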

10.
Today there are more than 80,000 chemicals in commerce and the environment. The potential human health risks are unknown for the vast majority of these chemicals as they lack human health risk assessments, toxicity reference values, and risk screening values. We aim to use computational toxicology and quantitative high-throughput screening (qHTS) technologies to fill these data gaps, and begin to prioritize these chemicals for additional assessment. In this pilot, we demonstrate how we were able to identify that benzo[k]fluoranthene may induce DNA damage and steatosis using qHTS data and two separate adverse outcome pathways (AOPs). We also demonstrate how bootstrap natural spline-based meta-regression can be used to integrate data across multiple assay replicates to generate a concentration-response curve. We used this analysis to calculate an in vitro point of departure of 0.751 μM and risk-specific in vitro concentrations of 0.29 μM and 0.28 μM for 1:1,000 and 1:10,000 risk, respectively, for DNA damage. Based on the available evidence, and considering that only a single HSD17B4 assay is available, we have low overall confidence in the steatosis hazard identification. This case study suggests that coupling qHTS assays with AOPs and ontologies will facilitate hazard identification. Combining this with quantitative evidence integration methods, such as bootstrap meta-regression, may allow risk assessors to identify points of departure and risk-specific internal/in vitro concentrations. These results are sufficient to prioritize the chemicals; however, in the longer term we will need to estimate external doses for risk screening purposes, such as through margin of exposure methods.

11.
Richard A. Canady, Risk Analysis, 2010, 30(11): 1663-1670
A September 2008 workshop sponsored by the Society for Risk Analysis(1) on risk assessment methods for nanoscale materials explored "nanotoxicology" in risk assessment. A general conclusion of the workshop was that, while research indicates that some nanoscale materials are toxic, the information presented at the workshop does not indicate the need for a conceptually different approach for risk assessment on nanoscale materials, compared to other materials. However, the toxicology discussions did identify areas of uncertainty that present a challenge for the assessment of nanoscale materials. These areas include novel metrics, characterizing multivariate dynamic mixtures, identification of toxicologically relevant properties and "impurities" for nanoscale characteristics, and characterizing persistence, toxicokinetics, and weight of evidence in consideration of the dynamic nature of the mixtures. The discussion also considered "nanomaterial uncertainty factors" for health risk values like the Environmental Protection Agency's reference dose (RfD). Similar to the general opinions for risk assessment, participants expressed that completing a data set regarding toxicity, or extrapolation between species, sensitive individuals, or durations of exposure, were not qualitatively different considerations for nanoscale materials in comparison to all chemicals, and therefore a "nanomaterial uncertainty factor" for all nanomaterials does not seem appropriate. However, the quantitative challenges may require new methods and approaches to integrate the information and the uncertainty.

12.
A conceptual framework is presented for conducting exposure assessments under the U.S. EPA's Voluntary Children's Chemical Evaluation Program (VCCEP). The VCCEP is a voluntary program whereby companies that manufacture chemicals of potential concern are asked to conduct hazard, exposure, and risk assessments for the chemicals. The VCCEP is unique in its risk-based, tiered approach, and because it focuses on children and requires a comprehensive consideration of all reasonably foreseeable exposure pathways for a particular chemical. The consideration of all potential exposure pathways for some commonly used chemicals presents a daunting challenge for the exposure assessor. This article presents a framework for managing this complicated process, and illustrates the application of the framework with a hypothetical case study. The framework provides guidance for interpreting multiple sources of exposure information and developing a plausible list of exposure pathways for a chemical. Furthermore, the framework provides a means to process all the available information to eliminate pathways of negligible concern from consideration. Finally, the framework provides guidance for utilizing the tiered approach of VCCEP to efficiently conduct an assessment by first using simple, screening-level approaches and then, if necessary, using more complex, refined exposure assessment methods. The case study provides an illustration of the major concepts.

13.
Estimates have been made of the cancer potency of aflatoxin exposure among the U.S. population. Risk modeling is used to assess the dose-response relationship between aflatoxin exposure and primary liver cancer, controlling for hepatitis B virus (HBV), based on data provided by the Yeh et al. study in China. A relative risk model is proposed as a more appropriate alternative to the additive ("absolute" risk) model for transporting risk coefficients between populations with different baseline rates. Several general relative risk models were examined; the exponential model provided the best fit. The Poisson regression method was used to fit the relative risk model to the grouped data. The effects of exposure to aflatoxin (AFB1) and hepatitis B infection were both found to be statistically significant. The risk of death from liver cancer for those exposed to AFB1, relative to the unexposed population, increases by 0.05% per ng/kg/day exposure of AFB1 (p < 0.001). The results also indicated a 25-fold increase in the risk of death from liver cancer among those infected with hepatitis B virus, relative to noncarriers (p < 0.0001). With a hepatitis prevalence rate of 1%, the aflatoxin intake level associated with a liver cancer lifetime excess risk of 1 × 10^-5 for the U.S. population was estimated as 253 ng/day, based on a liver cancer baseline rate of 3.4/100,000/yr.
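The exponential relative-risk model above can be sketched numerically. This is a toy reconstruction: beta is chosen so that risk rises about 0.05% per ng/kg/day at low dose, and the 25-fold HBV factor is taken from the abstract; it does not reproduce the article's full Poisson regression or the 253 ng/day calculation.

```python
import math

beta_afb1 = 0.0005   # per ng/kg/day, matching the reported ~0.05% increase
rr_hbv = 25.0        # HBV carriers vs. noncarriers (from the abstract)

def rr(dose_ng_kg_day, hbv_carrier=False):
    """Relative risk of liver cancer death under RR(d) = exp(beta * d)."""
    r = math.exp(beta_afb1 * dose_ng_kg_day)
    return r * rr_hbv if hbv_carrier else r

# Low-dose behavior: exp(beta*d) - 1 ~ beta*d, i.e. ~0.05% excess per ng/kg/day.
print(rr(1.0) - 1.0)
print(rr(10.0, hbv_carrier=True))
```

The multiplicative form is what makes the coefficients transportable: the same RR(d) scales whatever baseline rate a target population has.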

14.
For the vast majority of chemicals that have cancer potency estimates on IRIS, the underlying database is deficient with respect to early-life exposures. This data gap has prevented derivation of cancer potency factors that are relevant to this time period, and so assessments may not fully address children's risks. This article provides a review of juvenile animal bioassay data in comparison to adult animal data for a broad array of carcinogens. This comparison indicates that short-term exposures in early life are likely to yield a greater tumor response than short-term exposures in adults, but similar tumor response when compared to long-term exposures in adults. This evidence is brought into a risk assessment context by proposing an approach that: (1) does not prorate children's exposures over the entire life span or mix them with exposures that occur at other ages; (2) applies the cancer slope factor from adult animal or human epidemiology studies to the children's exposure dose to calculate the cancer risk associated with the early-life period; and (3) adds the cancer risk for young children to that for older children/adults to yield a total lifetime cancer risk. The proposed approach allows for the unique exposure and pharmacokinetic factors associated with young children to be fully weighted in the cancer risk assessment. It is very similar to the approach currently used by U.S. EPA for vinyl chloride. The current analysis finds that the database of early life and adult cancer bioassays supports extension of this approach from vinyl chloride to other carcinogens of diverse mode of action. This approach should be enhanced by early-life data specific to the particular carcinogen under analysis whenever possible.

15.
A new mathematical dose-response model for reproductive and developmental risk assessment is proposed. The model includes the possibility of an exposure threshold as well as a litter-size effect. Correlation of responses of offspring from the same litter is taken into account through the use of the beta-binomial distribution. Confidence limits for low-dose extrapolation are based on the asymptotic distribution of the likelihood ratio. An empirical comparison of the proposed procedure to that of Rai and Van Ryzin demonstrates the improvement that can be achieved with the new procedure.
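The beta-binomial litter effect mentioned above can be illustrated with SciPy (the parameter values are assumptions for illustration, not the article's fitted values): pups in a litter share a latent response probability p ~ Beta(a, b), which inflates the variance relative to an ordinary binomial with the same mean and implies an intra-litter correlation of 1/(a + b + 1).

```python
from scipy.stats import betabinom, binom

n = 10                       # litter size (assumed)
a, b = 2.0, 8.0              # hypothetical Beta parameters (mean p = 0.2)

bb = betabinom(n, a, b)      # beta-binomial: litter-level heterogeneity in p
bi = binom(n, a / (a + b))   # ordinary binomial with the same mean

# Same expected number of affected pups, but beta-binomial is overdispersed.
print(bb.mean(), bi.mean())
print(bb.var(), bi.var())

# Intra-litter correlation implied by (a, b).
rho = 1.0 / (a + b + 1.0)
print(rho)
```

Ignoring rho and fitting a plain binomial would understate the variance, which is why the model builds the beta-binomial in directly.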

16.
In December 2000 the EPA initiated the Voluntary Children's Chemical Evaluation Program (VCCEP) by asking manufacturers to voluntarily sponsor toxicological testing in a tiered process for 23 chemicals selected for the pilot phase. The tiered nature of the VCCEP pilot program creates the need for clearly defined criteria for determining when information is sufficient to assess the potential risks to children. This raises questions about how to determine the "adequacy" of the existing information and assess the need to undertake efforts to reduce uncertainty (through further testing). This article applies a value of information analysis approach to determine adequacy by modeling how toxicological and exposure data collected through the VCCEP may be used to inform risk management decisions. The analysis demonstrates the importance of information about the exposure level and control costs in making decisions regarding further toxicological testing. This article accounts for the cost of delaying control action and identifies the optimal testing strategy for a constrained decisionmaker who, absent applicable human data, cannot regulate without bioassay data on a specific chemical. It also quantifies the differences in optimal testing strategy for three decision criteria: maximizing societal net benefits, ensuring maximum exposure control while net benefits are positive (i.e., benefits outweigh costs), and controlling to the maximum extent technologically feasible while the lifetime risk of cancer exceeds a specific level of risk. Finally, this article shows the large differences that exist in net benefits between the three criteria for the range of exposure levels where the optimal actions differ.

17.
We review approaches for characterizing "peak" exposures in epidemiologic studies and methods for incorporating peak exposure metrics in dose-response assessments that contribute to risk assessment. The focus was on potential etiologic relations between environmental chemical exposures and cancer risks. We searched the epidemiologic literature on environmental chemicals classified as carcinogens in which cancer risks were described in relation to "peak" exposures. These articles were evaluated to identify some of the challenges associated with defining and describing cancer risks in relation to peak exposures. We found that definitions of peak exposure varied considerably across studies. Of nine chemical agents included in our review of peak exposure, six had epidemiologic data used by the U.S. Environmental Protection Agency (US EPA) in dose-response assessments to derive inhalation unit risk values. These were benzene, formaldehyde, styrene, trichloroethylene, acrylonitrile, and ethylene oxide. All derived unit risks relied on cumulative exposure for dose-response estimation and none, to our knowledge, considered peak exposure metrics. This is not surprising, given the historical linear no-threshold default model (generally based on cumulative exposure) used in regulatory risk assessments. With newly proposed US EPA rule language, fuller consideration of alternative exposure and dose-response metrics will be supported. "Peak" exposure has not been consistently defined and rarely has been evaluated in epidemiologic studies of cancer risks. We recommend developing uniform definitions of "peak" exposure to facilitate fuller evaluation of dose response for environmental chemicals and cancer risks, especially where mechanistic understanding indicates that the dose response is unlikely linear and that short-term high-intensity exposures increase risk.

18.
An international consensus on the need to reduce the use of chlorofluorocarbons (CFCs) and other ozone-depleting gases such as the halons led to the adoption of the 1987 Montreal Protocol and Title VI of the 1990 Clean Air Act Amendments, "Protecting Stratospheric Ozone." These agreements included major provisions for reducing and eventually phasing out production and use of CFCs and halons as well as advancing the development of replacement chemicals. Because of the ubiquitous use and benefits of CFCs and halons, an expeditious search for safe replacements to meet the legislative deadlines is of critical importance. Toxicity testing and health risk assessment programs were established to evaluate the health and environmental impact of these replacement chemicals. Development and implementation of these programs as well as the structure-activity relationships significant for the development of the replacement chemicals are described below. A dose-response evaluation for the health risk assessment of the replacement chemical HCFC-123 (2,2-dichloro-1,1,1-trifluoroethane) is also presented to show an innovative use of physiologically based pharmacokinetic (PBPK) modeling. This is based on a parallelogram approach using data on the anesthetic gas halothane, a structural analog to HCFC-123. Halothane and HCFC-123 both form the same metabolite, trifluoroacetic acid (TFA), indicative of the same metabolic oxidative pathway attributed to hepatotoxicity. The parallelogram approach demonstrates the application of template model structures and shows how PBPK modeling, together with judicious experimental design, can be used to improve the accuracy of health risk assessment and to decrease the need for extensive laboratory animal testing.

19.
This study is a replication and extension in Canada of a previous study in the United States in which toxicologists and members of the public were surveyed to determine their attitudes, beliefs, and perceptions regarding risks from chemicals. This study of "intuitive vs. scientific toxicology" was motivated by the premise that different assumptions, conceptions, and values underlie much of the discrepancy between expert and lay views of chemical risks. The results showed that Canadian toxicologists had far lower perceptions of risk and more favorable attitudes toward chemicals than did the Canadian public. The public's attitudes were quite negative and showed the same lack of dose-response sensitivity found in the earlier U.S. study. Both the public and the toxicologists lacked confidence in the value of animal studies for predicting human health risks. However, the public had great confidence in the validity of animal studies that found evidence of carcinogenicity, whereas such evidence was not considered highly predictive of human health risk by many toxicologists. Technical judgments of toxicologists were found to be associated with factors such as affiliation, gender, and worldviews. Implications of these data for risk communication are briefly discussed.

20.
Louise Ryan, Risk Analysis, 1992, 12(3): 439-447
This paper reviews and compares several approaches to fitting dose-response models to developmental toxicity data. The main issue of interest is how to appropriately account for litter effects. Among the approaches reviewed are Beta Binomial models, models that attempt to characterize the litter effect through the use of covariates, and models that avoid the complication of correlated offspring by modeling "affected litter" rather than fetus-specific outcomes. Finally, we discuss our recommended approach, which is to use Generalized Estimating Equations, or quasi-likelihood. We give a number of reasons for preferring the latter and illustrate its application with an example.

