20 similar records found (search time: 15 ms)
1.
Ioannis A. Papazoglou, Olga Aneziris, Linda Bellamy, B. J. M. Ale, Joy I. H. Oh. Risk Analysis, 2015, 35(8): 1536-1561
Occupational risk rates per hour of exposure have been quantified for 63 occupational accident types for the Dutch working population. Data were obtained from the analysis of more than 9,000 accidents that occurred over a period of six years in the Netherlands and resulted in three types of reportable consequences under Dutch law: (a) fatal injury, (b) permanent injury, and (c) serious recoverable injury requiring at least one day of hospitalization. A Bayesian uncertainty assessment on the value of the risk rates has been performed. Annual risks for each of the 63 occupational accident types have been calculated, including the variability in the annual exposure of the working population to the corresponding hazards. The suitability of three risk measures—individual risk rates, individual annual risk, and number of accidents—is examined and discussed.
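The Bayesian treatment of an accident rate per hour of exposure can be sketched with a conjugate Gamma-Poisson update. This is an illustrative reading of the approach, not the paper's actual model: the prior parameters, accident count, and exposure hours below are all invented (the study's estimates come from roughly 9,000 Dutch accident records).

```python
# Hedged sketch (not the paper's model): conjugate Gamma-Poisson update for
# an occupational accident rate per hour of exposure. With a Gamma(a0, b0)
# prior and k accidents observed over T exposure hours, the posterior is
# Gamma(a0 + k, b0 + T). All numbers below are illustrative.

def posterior_rate(accidents, exposure_hours, a0=0.5, b0=0.0):
    """Posterior mean and variance of a Poisson rate under a Gamma prior."""
    a = a0 + accidents
    b = b0 + exposure_hours
    return a / b, a / b**2  # posterior mean, posterior variance

# Hypothetical accident type: 12 reportable accidents over 2 million hours.
mean_rate, var_rate = posterior_rate(accidents=12, exposure_hours=2.0e6)
```

The posterior variance shrinks as exposure hours accumulate, which is one way the uncertainty assessment on the rate values can be quantified.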
2.
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry.
3.
Loup Rimbaud, Fanny Heraud, Sébastien La Vieille, Jean-Charles Leblanc, Amélie Crepet. Risk Analysis, 2010, 30(1): 7-19
Peanut allergy is a public health concern, owing to its high prevalence in France and the severity of the reactions. Despite peanut-containing product avoidance diets, a risk may exist due to the adventitious presence of peanut allergens in a wide range of food products. Peanut is not mentioned in the ingredients list of these products, but precautionary labeling is often present. A method of quantifying the risk of allergic reactions following the consumption of such products is developed, taking the example of peanut in chocolate tablets. The occurrence of adventitious peanut proteins in chocolate and the dose-response relationship are estimated with a Bayesian approach using available published data. The consumption pattern is described by the French individual consumption survey INCA2. Risk simulations are performed using second-order Monte Carlo simulations, which separately propagate variability and uncertainty of the model input variables. Peanut allergens occur in approximately 36% of the chocolates, leading to a mean exposure level of 0.2 mg of peanut proteins per eating occasion. The estimated risk of reaction averages 0.57% per eating occasion for peanut-allergic adults. The 95% values of the risk stand between 0 and 3.61%, which illustrates the risk variability. The uncertainty, represented by the 95% credible intervals, is concentrated around these risk estimates. Results for children are similar. The conclusion is that adventitious peanut allergens induce a risk of reaction for part of the French peanut-allergic population. The method developed can be generalized to assess the risk due to the consumption of any foodstuff potentially contaminated by allergens.
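A second-order Monte Carlo scheme of the kind described, with uncertainty sampled in an outer loop and variability in an inner loop, can be sketched as follows. Every distribution and parameter value here is invented for illustration (only the ~36% prevalence echoes the abstract); none of it is the paper's fitted input.

```python
import random

# Illustrative second-order Monte Carlo: the outer loop samples *uncertain*
# parameters (contamination prevalence, dose-response slope - both assumed
# distributions), the inner loop samples *variable* quantities (dose per
# eating occasion). Separating the loops keeps uncertainty and variability
# distinct in the output: each outer iteration yields one risk estimate.

random.seed(1)

def risk_per_occasion(n_outer=200, n_inner=500):
    risks = []
    for _ in range(n_outer):                      # uncertainty loop
        prevalence = random.betavariate(36, 64)   # ~36% of products contaminated
        slope = random.lognormvariate(-6.0, 0.5)  # assumed dose-response slope
        hits = 0
        for _ in range(n_inner):                  # variability loop
            if random.random() < prevalence:
                dose = random.lognormvariate(-1.5, 1.0)  # mg protein, assumed
                if random.random() < min(1.0, slope * dose):
                    hits += 1
        risks.append(hits / n_inner)
    return risks

risks = risk_per_occasion()
mean_risk = sum(risks) / len(risks)
```

The spread of `risks` across outer iterations reflects parameter uncertainty, while each inner loop integrates over person-to-person and occasion-to-occasion variability.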
4.
Ana Rita Salgueiro, Henrique Garcia Pereira, Maria-Teresa Rico, Gerardo Benito, Andrés Díez-Herrero. Risk Analysis, 2008, 28(1): 13-23
A new statistical approach for preliminary risk evaluation of breakage in tailings dams is presented and illustrated by a case study regarding the Mediterranean region. The objective of the proposed method is to establish an empirical scale of risk, from which guidelines for prioritizing the collection of further specific information can be derived. The method relies on a historical database containing, in essence, two sets of qualitative data: the first set concerns the variables that are observable before the disaster (e.g., type and size of the dam, its location, and state of activity), and the second refers to the consequences of the disaster (e.g., failure type, sludge characteristics, fatalities categorization, and downstream range of damage). Based on a modified form of correspondence analysis, where the second set of attributes is projected as "supplementary variables" onto the axes provided by the eigenvalue decomposition of the matrix referring to the first set, a "qualitative regression" is performed, relating the variables to be predicted (contained in the second set) with the "predictors" (the observable variables). On the grounds of the previously derived relationship, the risk of breakage in a new case can be evaluated, given its observable variables. The method was applied in a case study regarding a set of 13 test sites, where the ranking of risk obtained was validated by expert knowledge. Once validated, the procedure was included in the final output of the e-EcoRisk EU project (A Regional Enterprise Network Decision-Support System for Environmental Risk and Disaster Management of Large-Scale Industrial Spills), allowing for dynamic updating of the historical database and providing a prompt rough risk evaluation for a new case. The aim of this part of the global project is to provide a quantified context of past failure cases to support analogue reasoning in preventing similar situations.
5.
Updating Uncertainty in an Integrated Risk Assessment: Conceptual Framework and Methods
Bayesian methods are presented for updating the uncertainty in the predictions of an integrated Environmental Health Risk Assessment (EHRA) model. The methods allow the estimation of posterior uncertainty distributions based on the observation of different model outputs along the chain of the linked assessment framework. Analytical equations are derived for the case of the multiplicative lognormal risk model where the sequential log outputs (log ambient concentration, log applied dose, log delivered dose, and log risk) are each normally distributed. Given observations of a log output made with a normally distributed measurement error, the posterior distributions of the log outputs remain normal, but with modified means and variances, and induced correlations between successive log outputs and log inputs. The analytical equations for forward and backward propagation of the updates are generally applicable to sums of normally distributed variables. The Bayesian Monte-Carlo (BMC) procedure is presented to provide an approximate, but more broadly applicable method for numerically updating uncertainty with concurrent backward and forward propagation. Illustrative examples, presented for the multiplicative lognormal model, demonstrate agreement between the analytical and BMC methods, and show how uncertainty updates can propagate through a linked EHRA. The Bayesian updating methods facilitate the pooling of knowledge encoded in predictive models with that transmitted by research outcomes (e.g., field measurements), and thereby support the practice of iterative risk assessment and value of information appraisals.
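The key property exploited by the analytical equations — a normally distributed log output observed with normal measurement error stays normal, with a shrunken variance — can be illustrated with a minimal normal-normal update. The numbers are illustrative, not taken from the paper.

```python
# Minimal sketch of the normal-normal conjugate update underlying the
# analytical equations: a log output with prior N(mu, s2), observed as y
# with normally distributed measurement error of variance v2, remains
# normal a posteriori. Numbers below are illustrative.

def update_normal(mu, s2, y, v2):
    """Posterior mean and variance of a normal prior after a noisy observation."""
    w = s2 / (s2 + v2)              # weight given to the observation
    post_mu = mu + w * (y - mu)     # precision-weighted mean
    post_s2 = s2 * v2 / (s2 + v2)   # variance always shrinks
    return post_mu, post_s2

# Prior on a log output: N(0.0, 1.0); observe y = 1.0 with error variance 1.0.
post_mu, post_s2 = update_normal(0.0, 1.0, 1.0, 1.0)
```

With equal prior and measurement variances the posterior mean lands halfway between prior mean and observation, and the variance is halved; the paper's equations additionally propagate such updates forward and backward along the linked log outputs.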
6.
This article presents a discourse on the incorporation of organizational factors into probabilistic risk assessment (PRA)/probabilistic safety assessment (PSA), a topic of debate since the 1980s that has spurred discussions among industry, regulatory agencies, and the research community. The main contributions of this article include (1) identifying the four key open questions associated with this topic; (2) framing ongoing debates by considering differing perspectives around each question; (3) offering a categorical review of existing studies on this topic to justify the selection of each question and to analyze the challenges related to each perspective; and (4) highlighting the directions of research required to reach a final resolution for each question. The four key questions are: (I) How significant is the contribution of organizational factors to accidents and incidents? (II) How critical, with respect to improving risk assessment, is the explicit incorporation of organizational factors into PRA? (III) What theoretical bases are needed for explicit incorporation of organizational factors into PRA? (IV) What methodological bases are needed for the explicit incorporation of organizational factors into PRA? Questions I and II mainly analyze PRA literature from the nuclear domain. For Questions III and IV, a broader review and categorization is conducted of those existing cross-disciplinary studies that have evaluated the effects of organizational factors on safety (not solely PRA-based) to shed more light on future research needs.
7.
A predictive case-cohort model is applied to Norwegian data to analyze the interaction between challenge and stability factors for bovine spongiform encephalopathy (BSE) during the period 1980-2010. For each year, the BSE risk in cattle is estimated as the expected number of cases. The age distribution of expected cases as well as the relative impact of different challenges is estimated. The model consists of a simple, transparent, and practical deterministic spreadsheet calculation model, in which the following country-specific inputs are entered: (i) annual imports of live cattle and meat and bone meal, (ii) age distribution of native cattle, and (iii) estimated annual basic reproduction ratio (R0) for BSE. Results for Norway indicate that the highest risk of BSE cases was in 1989, when a total BSE risk of 0.13 cases per year was expected. After that date, the year-to-year decrease in risk ranged between 3% and 47%, except for a secondary peak in 1994 at 0.06 cases per year. The primary peak was almost entirely (99%) attributable to the importation of 11 cattle from the United Kingdom between 1982 and 1986. The secondary peak, in 1994, originated mainly from the recycling of the U.K. imported cattle (92%). In 2006, the remaining risk was 0.0003 cases per year, or 0.001 per million cows per year, with a maximal age-specific incidence of 0.03 cases per million per year in 10-year-old cattle. Only 15% of the cases were expected in imported cattle. The probability of having zero cases in Norway in 2006 was estimated to be 99.97%. The model and results are compared to previous risk assessments of Norway by the EU.
8.
Klaus Schneeberger, Matthias Huttenlau, Benjamin Winter, Thomas Steinberger, Stefan Achleitner, Johann Stötter. Risk Analysis, 2019, 39(1): 125-139
This article presents a flood risk analysis model that considers the spatially heterogeneous nature of flood events. The basic concept of this approach is to generate a large sample of flood events that can be regarded as a temporal extrapolation of observed flood events. These are combined with cumulative flood impact indicators, such as building damages, to finally derive time series of damages for risk estimation. Therefore, a multivariate modeling procedure that is able to take into account the spatial characteristics of flooding, the regionalization method top-kriging, and three different impact indicators are combined in a model chain. Eventually, the expected annual flood impact (e.g., expected annual damages) and the flood impact associated with a low probability of occurrence are determined for a study area. The risk model has the potential to augment the understanding of flood risk in a region and thereby contribute to enhanced risk management by, for example, risk analysts, policymakers, or insurance companies. The modeling framework was successfully applied in a proof-of-concept exercise in Vorarlberg (Austria). The results of the case study show that risk analysis has to be based on spatially heterogeneous flood events in order to estimate flood risk adequately.
9.
Justin Pence, Ian Miller, Tatsuya Sakurahara, James Whitacre, Seyed Reihani, Ernie Kee, Zahra Mohaghegh. Risk Analysis, 2019, 39(6): 1262-1280
In the nuclear power industry, Level 3 probabilistic risk assessment (PRA) is used to estimate damage to public health and the environment if a severe accident leads to large radiological release. Current Level 3 PRA does not have an explicit inclusion of social factors and, therefore, it is not possible to perform importance ranking of social factors for risk-informing emergency preparedness, planning, and response (EPPR). This article offers a methodology for adapting the concept of social vulnerability, commonly used in natural hazard research, in the context of a severe nuclear power plant accident. The methodology has four steps: (1) calculating a hazard-independent social vulnerability index for the local population; (2) developing a location-specific representation of the maximum radiological hazard estimated from current Level 3 PRA, in a geographic information system (GIS) environment; (3) developing a GIS-based socio-technical risk map by combining the social vulnerability index and the location-specific radiological hazard; and (4) conducting a risk importance measure analysis to rank the criticality of social factors based on their contribution to the socio-technical risk. The methodology is applied using results from the 2012 Surry Power Station state-of-the-art reactor consequence analysis. A radiological hazard model is generated from the MELCOR accident consequence code system, translated into a GIS environment, and combined with the Centers for Disease Control and Prevention social vulnerability index (SVI). This research creates an opportunity to explicitly consider and rank the criticality of location-specific SVI themes based on their influence on risk, providing input for EPPR.
10.
Presenting Uncertainty in Health Risk Assessment: Initial Studies of Its Effects on Risk Perception and Trust
Some analysts suggest that discussing uncertainties in health risk assessments might reduce citizens' perceptions of risk and increase their respect for the risk-assessing agency. We tested this assumption with simulated news stories varying simple displays of uncertainty (e.g., a range of risk estimates, with and without graphics). Subjects from Eugene, Oregon, read one story each, and then answered a questionnaire. Three studies tested between 180 and 272 subjects each. Two focus groups obtained more detailed responses to these stories. The results suggested that (1) people are unfamiliar with uncertainty in risk assessments and in science; (2) people may recognize uncertainty when it is presented simply; (3) graphics may help people recognize uncertainty; (4) reactions to the environmental problems in the stories seemed affected less by presentation of uncertainty than by general risk attitudes and perceptions; (5) agency discussion of uncertainty in risk estimates may signal agency honesty and agency incompetence for some people; and (6) people seem to see lower risk estimates (10^-6, as opposed to 10^-3) as less credible. These findings, if confirmed, would have important implications for risk communication.
11.
Mitchell J. Small. Risk Analysis, 2008, 28(5): 1289-1308
The distributional approach for uncertainty analysis in cancer risk assessment is reviewed and extended. The method considers a combination of bioassay study results, targeted experiments, and expert judgment regarding biological mechanisms to predict a probability distribution for uncertain cancer risks. Probabilities are assigned to alternative model components, including the determination of human carcinogenicity, mode of action, the dosimetry measure for exposure, the mathematical form of the dose-response relationship, the experimental data set(s) used to fit the relationship, and the formula used for interspecies extrapolation. Alternative software platforms for implementing the method are considered, including Bayesian belief networks (BBNs) that facilitate assignment of prior probabilities, specification of relationships among model components, and identification of all output nodes on the probability tree. The method is demonstrated using the application of Evans, Sielken, and co-workers for predicting cancer risk from formaldehyde inhalation exposure. Uncertainty distributions are derived for maximum likelihood estimate (MLE) and 95th percentile upper confidence limit (UCL) unit cancer risk estimates, and the effects of resolving selected model uncertainties on these distributions are demonstrated, considering both perfect and partial information for these model components. A method for synthesizing the results of multiple mechanistic studies is introduced, considering the assessed sensitivities and selectivities of the studies for their targeted effects. A highly simplified example is presented illustrating assessment of genotoxicity based on studies of DNA damage response caused by naphthalene and its metabolites. The approach can provide a formal mechanism for synthesizing multiple sources of information using a transparent and replicable weight-of-evidence procedure.
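The core mechanics of the distributional approach — assigning probabilities to alternative model components and mixing the resulting risk estimates into one uncertainty distribution over a probability tree — can be sketched in miniature. All branch probabilities and unit-risk values below are invented; the real application uses many more components (carcinogenicity, mode of action, dosimetry, dose-response form, and so on).

```python
import itertools

# Toy version of the distributional approach: each model component gets a
# small set of alternatives with assigned probabilities; enumerating the
# probability tree yields leaves of (probability, risk) pairs whose mixture
# is the uncertainty distribution for the risk estimate. Values are invented.

branches = {
    "carcinogenic": [(0.8, 1.0), (0.2, 0.0)],         # (prob, risk multiplier)
    "dose_response": [(0.6, 1.0e-5), (0.4, 3.0e-6)],  # (prob, unit risk)
}

# Enumerate every combination of component alternatives (the tree's leaves).
leaves = []
for (p1, mult), (p2, unit) in itertools.product(*branches.values()):
    leaves.append((p1 * p2, mult * unit))

total_p = sum(p for p, _ in leaves)          # should sum to 1 over the tree
mean_risk = sum(p * r for p, r in leaves)    # mixture mean of the unit risk
```

Resolving a model uncertainty (e.g., learning the agent is carcinogenic) amounts to conditioning on one branch and renormalizing the remaining leaves, which is how the effect of perfect information can be examined in this framework.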
12.
Jan F. Van Impe. Risk Analysis, 2011, 31(8): 1295-1307
The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and if possible it is advisable to characterize variability separately from uncertainty. Sensitivity analysis indicates to which of the input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist to apply sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods are used—that is, an ANOVA-like model and Sobol sensitivity indices—to obtain and compare the impact of variability and of uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment to estimate the risk of contracting listeriosis due to consumption of deli meats.
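The ANOVA-like idea behind variance-based sensitivity analysis can be sketched by estimating a first-order index S_i = Var(E[Y|X_i]) / Var(Y) from binned Monte Carlo samples. The toy model Y = X1 + 0.1·X2 is invented for illustration; the paper applies the idea to a listeriosis risk model with separated variability and uncertainty dimensions.

```python
import random

# Hedged sketch of a first-order (Sobol-type) sensitivity index estimated
# with an ANOVA-like binning scheme: S_i = Var(E[Y|X_i]) / Var(Y).
# Toy model Y = X1 + 0.1*X2 with independent uniform inputs, so X1 should
# dominate. Model and sample sizes are illustrative only.

random.seed(2)

n = 20000
x1 = [random.random() for _ in range(n)]
x2 = [random.random() for _ in range(n)]
y = [a + 0.1 * b for a, b in zip(x1, x2)]

def first_order_index(x, y, bins=20):
    """Estimate Var(E[Y|X]) / Var(Y) by binning samples on X in [0, 1)."""
    groups = [[] for _ in range(bins)]
    for xi, yi in zip(x, y):
        groups[min(int(xi * bins), bins - 1)].append(yi)
    mean_y = sum(y) / len(y)
    var_y = sum((v - mean_y) ** 2 for v in y) / len(y)
    cond_means = [sum(g) / len(g) for g in groups if g]
    weights = [len(g) / len(y) for g in groups if g]
    between = sum(w * (m - mean_y) ** 2 for w, m in zip(weights, cond_means))
    return between / var_y

s1 = first_order_index(x1, y)  # expected near 1: X1 drives the output
s2 = first_order_index(x2, y)  # expected near 0: X2 barely matters
```

In the paper's setting the same decomposition is applied to the uncertainty and variability coordinates of each input separately, so their contributions to the risk estimate can be compared.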
13.
Quantitative Risk Assessment of Norovirus Transmission in Food Establishments: Evaluating the Impact of Intervention Strategies and Food Employee Behavior on the Risk Associated with Norovirus in Foods
Steven Duret, Régis Pouillot, Wendy Fanaselle, Efstathia Papafragkou, Girvin Liggans, Laurie Williams, Jane M. Van Doren. Risk Analysis, 2017, 37(11): 2080-2106
We developed a quantitative risk assessment model using a discrete event framework to quantify and study the risk associated with norovirus transmission to consumers through food contaminated by infected food employees in a retail food setting. This study focused on the impact of ill food workers experiencing symptoms of diarrhea and vomiting and potential control measures for the transmission of norovirus to foods. The model examined the behavior of food employees regarding exclusion from work while ill and after symptom resolution and preventive measures limiting food contamination during preparation. The mean numbers of infected customers estimated for 21 scenarios were compared to the estimate for a baseline scenario representing current practices. Results show that prevention strategies examined could not prevent norovirus transmission to food when a symptomatic employee was present in the food establishment. Compliance with exclusion from work of symptomatic food employees is thus critical, with an estimated range of 75–226% of the baseline mean for full to no compliance, respectively. Results also suggest that efficient handwashing, handwashing frequency associated with gloving compliance, and elimination of contact between hands, faucets, and door handles in restrooms reduced the mean number of infected customers to 58%, 62%, and 75% of the baseline, respectively. This study provides quantitative data to evaluate the relative efficacy of policy and practices at retail to reduce norovirus illnesses and provides new insights into the interactions and interplay of prevention strategies and compliance in reducing transmission of foodborne norovirus.
14.
Olivier Catelinois, Dominique Laurier, Pierre Verger, Agnès Rogel, Marc Colonna, Marianne Ignasiak, Denis Hémon, Margot Tirmarche. Risk Analysis, 2005, 25(2): 243-252
The increase in the thyroid cancer incidence in France observed over the last 20 years has raised public concern about its association with the 1986 nuclear power plant accident at Chernobyl. At the request of French authorities, a first study sought to quantify the possible risk of thyroid cancer associated with the Chernobyl fallout in France. This study suffered from two limitations. The first involved the lack of knowledge of spontaneous thyroid cancer incidence rates (in the absence of exposure), which was especially necessary to take their trends into account for projections over time; the second was the failure to consider the uncertainties. The aim of this article is to enhance the initial thyroid cancer risk assessment for the period 1991-2007 in the area of France most exposed to the fallout (i.e., eastern France) and thereby mitigate these limitations. We consider the changes over time in the incidence of spontaneous thyroid cancer and conduct both uncertainty and sensitivity analyses. The number of spontaneous thyroid cancers was estimated from French cancer registries on the basis of two scenarios: one with a constant incidence, the other using the trend observed. Thyroid doses were estimated from all available data about contamination in France from Chernobyl fallout. Results from a 1995 pooled analysis published by Ron et al. were used to determine the dose-response relation. Depending on the scenario, the number of spontaneous thyroid cancer cases ranges from 894 (90% CI: 869-920) to 1,716 (90% CI: 1,691-1,741). The number of excess thyroid cancer cases predicted ranges from 5 (90% UI: 1-15) to 63 (90% UI: 12-180). All of the assumptions underlying the thyroid cancer risk assessment are discussed.
15.
The total ban on use of meat and bone meal (MBM) in livestock feed has been very successful in reducing bovine spongiform encephalopathy (BSE) spread, but also implies a waste of high-quality proteins resulting in economic and ecological loss. Now that the BSE epidemic is fading out, a partial lifting of the MBM ban might be considered. The objective of this study was to assess the BSE risk for the Netherlands if MBM derived from animals fit for human consumption, i.e., category 3 MBM, were used in nonruminant feed. A stochastic simulation model was constructed that calculates (1) the probability that infectivity of undetected BSE-infected cows ends up with calves and (2) the quantity of infectivity (Q_inf) consumed by calves in case of such an incident. Three pathways were considered via which infectivity can reach cattle: (1) cross-contamination in the feed mill, (2) cross-contamination on the primary farm, and (3) pasture contamination. Model calculations indicate that the overall probability that infectivity ends up with calves is 3.2%. In most such incidents the Q_inf is extremely small (median = 6.5 × 10^-12 ID50; mean = 1.8 × 10^-4 ID50), corresponding to an average probability of 1.3 × 10^-4 that an incident results in one or more new BSE infections. Cross-contamination in the feed mill is the most risky pathway. Combining model results with Dutch BSE prevalence estimates for the coming years, it can be concluded that the BSE risk of using category 3 MBM derived from Dutch cattle in nonruminant feed is very low.
16.
Heitor de Oliveira Duarte, Enrique Lopez Droguett, Márcio das Chagas Moura, Elainne Christine de Souza Gomes, Constança Barbosa, Verônica Barbosa, Moacyr Araújo. Risk Analysis, 2014, 34(5): 831-846
We developed a stochastic model for quantitative risk assessment for the Schistosoma mansoni (SM) parasite, which causes an endemic disease of public concern. The model provides answers in a useful format for public health decisions, uses data and expert opinion, and can be applied to any landscape where the snail Biomphalaria glabrata is the main intermediate host (South and Central America, the Caribbean, and Africa). It incorporates several realistic and case-specific features: stage-structured parasite populations, periodic praziquantel (PZQ) drug treatment for humans, density dependence, extreme events (prolonged rainfall), site-specific sanitation quality, environmental stochasticity, monthly rainfall variation, uncertainty in parameters, and spatial dynamics. We parameterize the model through a real-world application in the district of Porto de Galinhas (PG), one of the main touristic destinations in Brazil, where previous studies identified four parasite populations within the metapopulation. The results provide a good approximation of the dynamics of the system and are in agreement with our field observations, i.e., the lack of basic infrastructure (sanitation level and health programs) makes PG a suitable habitat for the persistence and growth of a parasite metapopulation. We quantify the risk of SM metapopulation explosion and quasi-extinction and the time to metapopulation explosion and quasi-extinction. We evaluate the sensitivity of the results under varying scenarios of future periodic PZQ treatment (based on the Brazilian Ministry of Health's plan) and sanitation quality. We conclude that the plan might be useful to slow SM metapopulation growth but not to control it. Additional investments in better sanitation are necessary.
17.
William J. Cronin IV, Eric J. Oswald, Michael L. Shelley, Jeffrey W. Fisher, Carlyle D. Flemming. Risk Analysis, 1995, 15(5): 555-565
A Monte Carlo simulation is incorporated into a risk assessment for trichloroethylene (TCE) using physiologically-based pharmacokinetic (PBPK) modeling coupled with the linearized multistage model to derive human carcinogenic risk extrapolations. The Monte Carlo technique incorporates physiological parameter variability to produce a statistically derived range of risk estimates which quantifies specific uncertainties associated with PBPK risk assessment approaches. Both inhalation and ingestion exposure routes are addressed. Simulated exposure scenarios were consistent with those used by the Environmental Protection Agency (EPA) in their TCE risk assessment. Mean values of physiological parameters were gathered from the literature for both mice (carcinogenic bioassay subjects) and for humans. Realistic physiological value distributions were assumed using existing data on variability. Mouse cancer bioassay data were correlated to total TCE metabolized and area-under-the-curve (blood concentration) trichloroacetic acid (TCA) as determined by a mouse PBPK model. These internal dose metrics were used in a linearized multistage model analysis to determine dose metric values corresponding to a 10^-6 lifetime excess cancer risk. Using a human PBPK model, these metabolized doses were then extrapolated to equivalent human exposures (inhalation and ingestion). The Monte Carlo iterations with varying mouse and human physiological parameters produced a range of human exposure concentrations producing a 10^-6 risk.
18.
Flood Hazard and Flood Risk Assessment Using a Time Series of Satellite Images: A Case Study in Namibia
In this article, the use of a time series of satellite imagery for flood hazard mapping and flood risk assessment is presented. Flooded areas are extracted from satellite images for the flood-prone territory, and a maximum flood extent image for each flood event is produced. These maps are further fused to determine the relative frequency of inundation (RFI). The study shows that RFI values and relative water depth exhibit the same probabilistic distribution, which is confirmed by a Kolmogorov-Smirnov test. The produced RFI map can be used as a flood hazard map, especially in cases when flood modeling is complicated by a lack of available data and high uncertainties. The derived RFI map is further used for flood risk assessment. The efficiency of the presented approach is demonstrated for the Katima Mulilo region (Namibia). A time series of Landsat-5/7 satellite images acquired from 1989 to 2012 is processed to derive an RFI map using the presented approach. The following direct damage categories are considered in the study for flood risk assessment: dwelling units, roads, health facilities, and schools. The produced flood risk map shows that the risk is distributed uniformly all over the region. The cities and villages with the highest risk are identified. The proposed approach has minimum data requirements, and RFI maps can be generated rapidly to assist rescuers and decisionmakers in case of emergencies. On the other hand, limitations include strong dependence on the available data sets and limitations in simulations with extrapolated water depth values.
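The RFI computation described above reduces to a per-cell frequency over a stack of binary flood-extent masks. The 3×3 masks below are made up for illustration; in the study the masks come from classified Landsat-5/7 scenes, one per observed flood event.

```python
# Toy sketch of the relative frequency of inundation (RFI): given one binary
# flood-extent mask per observed event (1 = flooded cell), RFI at each cell
# is the fraction of events in which that cell was flooded. Masks are invented.

masks = [
    [[1, 1, 0], [1, 0, 0], [0, 0, 0]],  # event 1
    [[1, 1, 1], [1, 1, 0], [0, 0, 0]],  # event 2 (largest extent)
    [[1, 0, 0], [0, 0, 0], [0, 0, 0]],  # event 3 (smallest extent)
]

def rfi(masks):
    """Per-cell relative frequency of inundation across all event masks."""
    n = len(masks)
    rows, cols = len(masks[0]), len(masks[0][0])
    return [[sum(m[r][c] for m in masks) / n for c in range(cols)]
            for r in range(rows)]

rfi_map = rfi(masks)
```

Cells flooded in every event get RFI = 1.0 (highest hazard) and never-flooded cells get 0.0; thresholding or classifying this map yields the flood hazard map used downstream for risk assessment.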
19.
Recommendations on the Testing and Use of Pseudo-Random Number Generators Used in Monte Carlo Analysis for Risk Assessment
Timothy M. Barry. Risk Analysis, 1996, 16(1): 93-105
Monte Carlo simulation requires a pseudo-random number generator with good statistical properties. Linear congruential generators (LCGs) are the most popular and well-studied computer method for generating pseudo-random numbers used in Monte Carlo studies. High quality LCGs are available with sufficient statistical quality to satisfy all but the most demanding needs of risk assessors. However, because of the discrete, deterministic nature of LCGs, it is important to evaluate the randomness and uniformity of the specific pseudo-random number subsequences used in important risk assessments. Recommended statistical tests for uniformity and randomness include the Kolmogorov-Smirnov test, extreme values test, and the runs test, including runs above and runs below the mean tests. Risk assessors should evaluate the stability of their risk model's output statistics, paying particular attention to instabilities in the mean and variance. When instabilities in the mean and variance are observed, more stable statistics, e.g., percentiles, should be reported. Analyses should be repeated using several non-overlapping pseudo-random number subsequences. More simulations than those traditionally used are also recommended for each analysis.
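The kind of check recommended can be sketched by generating a subsequence from an LCG and counting runs above and below the mean. The LCG constants are the well-known Park-Miller "minimal standard" values; the run-counting here is only the raw statistic, not a formal hypothesis test with a computed p-value.

```python
# Sketch of a subsequence check for an LCG: generate uniforms from the
# Park-Miller minimal standard generator and count maximal runs of values
# above/below 0.5 (the runs-above-and-below-the-mean statistic). A formal
# test would compare the count to its null distribution; for n iid uniforms
# the expected number of runs is roughly n/2 + 1.

def lcg(seed, n, a=16807, m=2**31 - 1):
    """Park-Miller multiplicative LCG yielding n uniforms in (0, 1)."""
    x = seed
    out = []
    for _ in range(n):
        x = (a * x) % m
        out.append(x / m)
    return out

def count_runs(u, threshold=0.5):
    """Count maximal runs of values on one side of the threshold."""
    runs, prev = 0, None
    for v in u:
        side = v > threshold
        if side != prev:
            runs += 1
            prev = side
    return runs

u = lcg(seed=42, n=1000)
runs = count_runs(u)
```

Far too few runs would indicate clumping and far too many would indicate alternation; either way the specific subsequence (seed and length) should be rejected for the assessment, which is the article's point about testing the subsequences actually used.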
20.
This report summarizes the proceedings of a conference on quantitative methods for assessing the risks of developmental toxicants. The conference was planned by a subcommittee of the National Research Council's Committee on Risk Assessment Methodology in conjunction with staff from several federal agencies, including the U.S. Environmental Protection Agency, U.S. Food and Drug Administration, U.S. Consumer Products Safety Commission, and Health and Welfare Canada. Issues discussed at the workshop included computerized techniques for hazard identification, use of human and animal data for defining risks in a clinical setting, relationships between end points in developmental toxicity testing, reference dose calculations for developmental toxicology, analysis of quantitative dose-response data, mechanisms of developmental toxicity, physiologically based pharmacokinetic models, and structure-activity relationships. Although a formal consensus was not sought, many participants favored the evolution of quantitative techniques for developmental toxicology risk assessment, including the replacement of lowest observed adverse effect levels (LOAELs) and no observed adverse effect levels (NOAELs) with the benchmark dose methodology.