Similar documents
20 similar documents found (search time: 562 ms)
1.
The intake of methyl-Hg and EPA + DHA through seafood consumption in Europe, as well as the associated probabilities of exceeding the provisional tolerable weekly intake (PTWI) and the recommended daily intake (RDI), respectively, were estimated by combining methyl-Hg and EPA + DHA contents in the five most consumed seafood species with hypothesized consumption distributions for eight European countries, chosen on the basis of size and representativeness. Two estimators were used: plug-in (PI) and tail estimation (TE), the latter based on applying extreme value theory to the intake distribution curves. Contents data were collected from our own database and published scientific papers, whereas consumption data were obtained from national statistical sources. Seafood consumption levels varied considerably between countries, from 140 g/(person·week) in the United Kingdom to 628.5 g/(person·week) in Iceland, and the main species consumed also differed between countries. The probability of exceeding the methyl-Hg PTWI ranged from 0.04% in the United Kingdom to 9.61% in Iceland. For the probability of exceeding the EPA + DHA RDI, Iceland ranked third after Portugal (66.05%) and Spain (61.05%), and the United Kingdom ranked last (0.32%). While TE was most accurate for small probabilities, PI yielded the best estimates for larger probabilities.
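As a rough illustration of the two estimators described above, the sketch below compares a plug-in exceedance frequency with a peaks-over-threshold tail fit (generalized Pareto) on simulated weekly intakes. The consumption and content distributions are invented for illustration; only the PTWI of 1.6 ug methyl-Hg per kg body weight per week reflects FAO/WHO guidance.

```python
# Sketch: plug-in (PI) vs. tail estimation (TE) of P(weekly MeHg intake > PTWI).
# All distributions and parameter values below are illustrative assumptions.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
n = 100_000

body_weight_kg = 70.0
ptwi = 1.6 * body_weight_kg                       # FAO/WHO PTWI for MeHg: 1.6 ug/kg bw/week

# Weekly intake (ug/week) = consumption (g/week) x MeHg content (ug/g), both lognormal
consumption = rng.lognormal(mean=np.log(140.0), sigma=0.6, size=n)   # g/(person.week)
content = rng.lognormal(mean=np.log(0.05), sigma=0.8, size=n)        # ug MeHg per g
intake = consumption * content

# Plug-in (PI): empirical exceedance frequency
p_plugin = np.mean(intake > ptwi)

# Tail estimation (TE): fit a generalized Pareto distribution to excesses over a
# high threshold u, then extrapolate the tail beyond u (peaks-over-threshold)
u = np.quantile(intake, 0.95)
excess = intake[intake > u] - u
c, loc, scale = genpareto.fit(excess, floc=0.0)
p_tail = (excess.size / n) * genpareto.sf(ptwi - u, c, loc=0.0, scale=scale)

print(f"P(weekly intake > PTWI)  plug-in: {p_plugin:.4%}   GPD tail: {p_tail:.4%}")
```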

2.
Inorganic arsenic (iAs), cadmium (Cd), lead (Pb), and methylmercury (MeHg) are toxic metals of substantial health concern that are present in various seafood items. This study linked probabilistic risk assessment to the interactive hazard index (HI_INT) approach to assess the human mixture risk posed by the dietary intake of iAs, Cd, Pb, and MeHg from seafood for different age populations; joint toxic actions and toxic interactions among the metals were also considered in the assessment. We found that, in combination, an iAs–Cd–Pb–MeHg mixture synergistically causes neurological toxicity, while an iAs–Cd–Pb mixture antagonistically causes renal and hematological effects and additively causes cardiovascular effects. Our results demonstrate that if toxic interactions are not considered, the health risk may be over- or underestimated. The 50th percentile HI_INT estimates in all age populations for neurological, renal, cardiovascular, and hematological effects were below 1; however, the 97.5th percentile HI_INT estimates might exceed 1. In particular, toddlers and preschoolers had the highest neurological risk, with probabilities of 0.16 and 0.19, respectively, that the neurological HI_INT exceeds 1. Saltwater fish consumption was the principal contributor to the health risk. We suggest that regular monitoring of metal levels in seafood, more precise dietary surveys, further toxicological data, and risk–benefit analysis of seafood consumption are warranted to improve the accuracy of human mixture risk assessment and to determine optimal consumption.
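A minimal sketch of a probabilistic hazard-index calculation for a metal mixture follows. The interaction adjustment is a single placeholder multiplier, not the weight-of-evidence formula behind HI_INT, and all intake and reference-dose values are assumptions chosen only to make the example run.

```python
# Sketch: Monte Carlo hazard index for a four-metal mixture, with a placeholder
# interaction adjustment. All values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Daily intakes (ug/kg bw/day), lognormal assumptions per metal
intake = {
    "iAs":  rng.lognormal(np.log(0.10), 0.7, n),
    "Cd":   rng.lognormal(np.log(0.20), 0.6, n),
    "Pb":   rng.lognormal(np.log(0.15), 0.6, n),
    "MeHg": rng.lognormal(np.log(0.05), 0.8, n),
}
# Assumed reference doses for the neurological endpoint (ug/kg bw/day)
rfd = {"iAs": 0.3, "Cd": 0.5, "Pb": 0.3, "MeHg": 0.1}

hq = {m: intake[m] / rfd[m] for m in intake}      # hazard quotients
hi_additive = sum(hq.values())                    # plain additive hazard index

# Placeholder interaction multiplier (>1 assumes net synergism for this endpoint)
interaction_factor = 1.5
hi_int = hi_additive * interaction_factor

for name, hi in [("additive HI", hi_additive), ("interaction-adjusted HI", hi_int)]:
    print(f"{name}: median={np.median(hi):.2f}, "
          f"97.5th pct={np.percentile(hi, 97.5):.2f}, "
          f"P(HI>1)={np.mean(hi > 1):.2%}")
```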

3.
As part of a comprehensive environmental health strategic planning project initiated by the government of Abu Dhabi, we assessed potential dietary exposure in the United Arab Emirates (UAE) to methylmercury (in seafood) and pesticides (in fruits and vegetables) above international guideline levels. We present results for the UAE population by age, gender, and body mass index. Our results show very low daily risks of exposure to pesticides in fruits and vegetables at levels exceeding WHO guidelines, even under the conservative assumption that no pesticides are removed during washing and food preparation. Thus, exposure to pesticides on fruits and vegetables does not appear to be a major public health concern in the UAE. The chances of exposure to methylmercury in seafood are much higher; our model estimates a mean 1-in-5 daily risk of exceeding the FAO/WHO provisional tolerable weekly intake. However, great caution should be used in interpreting these results, as we analyzed only the risks and not the substantial benefits of fish consumption. In fact, previous studies have demonstrated that exposure to the n-3 polyunsaturated fatty acids in fish can increase IQ in developing children and substantially decrease the risk of coronary heart disease and stroke in adults. Further research is warranted to compare the risk of methylmercury exposure from fish with the nutritional benefits of fish consumption in the UAE and to determine appropriate methods to communicate risk and benefit information to the UAE population.

4.
Significant quantities of toxic metals are emitted to the air by the incineration of waste, as well as by the combustion of coal and oil. To optimize the regulation of their emissions, one needs to know the cost of the damage they cause. That requires an impact pathway analysis, with realistic dispersion models, exposure-response functions, and monetary values. In this article we explain the method and assumptions and present results for arsenic, cadmium, mercury, and lead, the most important toxic metals in terms of damage cost. We also estimate their contribution to the damage cost of waste incineration and of electric power from coal for typical situations in Europe. The damage costs of As, Cd, and Pb are much higher than previous estimates because a large number of new epidemiological studies imply more numerous and more serious health effects than previously known. New cost-benefit studies for the abatement of toxic metal emissions are advisable. The discussion of the epidemiological studies and the derivation of exposure-response functions are presented in two companion articles, one for As and Cd, the other for Hg and Pb.
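The sketch below shows the bare arithmetic of an impact pathway calculation (emission, intake fraction, exposure-response slope, monetary value) collapsed into one function. The dispersion modelling, exposure-response functions, and monetary values used in the article are not reproduced; every number here is an illustrative assumption.

```python
# Sketch: a back-of-the-envelope impact-pathway damage cost for one pollutant:
# damage = emission x intake fraction x exposure-response slope x unit cost.
# All parameter values are illustrative assumptions, not the article's inputs.

def damage_cost_eur(emission_kg,
                    intake_fraction,                        # kg inhaled per kg emitted
                    breathing_rate_m3_day=13.0,
                    erf_slope_cases_per_person_yr_ug_m3=2.0e-6,
                    cost_per_case_eur=1.0e6):
    """Expected damage cost of an emission under a linear no-threshold ERF."""
    collective_intake_kg = emission_kg * intake_fraction    # population-summed intake
    ug_inhaled = collective_intake_kg * 1e9                 # kg -> ug
    # Convert collective intake to population exposure (person.yr.ug/m3)
    person_yr_ug_m3 = ug_inhaled / (breathing_rate_m3_day * 365.0)
    cases = person_yr_ug_m3 * erf_slope_cases_per_person_yr_ug_m3
    return cases * cost_per_case_eur

# Example: 100 kg emitted with an assumed intake fraction of 1e-5
print(f"Damage cost: {damage_cost_eur(100.0, 1e-5):,.0f} EUR")
```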

5.
The risk analysis of the health impact of foods is increasingly focused on integrated risk-benefit assessment, the results of which will also need to be communicated to consumers. It therefore becomes important to understand how consumers respond to integrated risk-benefit information. The quality-adjusted life year (QALY) is one measure that can be used to assess the balance between the risks and benefits associated with a particular food. The effectiveness of QALYs for communicating both positive and negative health effects of food consumption to consumers was examined using a 3 × 2 experiment that varied information about health changes, expressed in QALYs, associated with the consumption of fish (n = 325). The effect of this information on consumer perceptions of the usefulness of QALYs for describing health effects, and on risk and benefit perceptions, attitudes, and intentions to consume fish, was examined. Results demonstrated that consumers perceived QALYs as useful for communicating the health effects associated with food consumption. QALYs communicated as a net effect were preferred for food products associated with negative net health effects, while separate communication of risks and benefits may be preferred for food products associated with positive or zero net health effects. Information about health changes in terms of QALYs facilitated informed decision making by consumers, as indicated by its impact on risk and benefit perceptions, which was in line with the intent of the information. The impact of this information on actual food consumption choices merits further investigation.

6.
The objective of the present study was to integrate the relative risks from mercury exposure to stream biota, groundwater, and humans in the Río Artiguas (Sucio) river basin, Nicaragua, where local gold mining occurs. A hazard quotient was used as a common exchange rate in probabilistic estimations of exposure and effects by means of Monte Carlo simulations. The endpoints were the lethal no-observed-effect concentrations (NOECs) for stream organisms; the WHO guideline value and the inhibitory Hg concentrations in bacteria (IC) for groundwater; and the tolerable daily intake (TDI) and the benchmark dose level with an uncertainty factor of 10 (BMDL0.1) for humans. Macroinvertebrates and fish in the contaminated river face a higher risk of suffering from Hg exposure than humans eating contaminated fish or bacteria living in the groundwater. The river sediment is the most hazardous source for the macroinvertebrates, and the macroinvertebrates in turn pose the highest risk to fish. The distribution of Hg body concentrations in fish in the mining areas of the basin may exceed the distribution of endpoint values with close to 100% probability. Similarly, the Hg concentration in cord blood of humans feeding on fish from the river was predicted to exceed the BMDL0.1 with about 10% probability. Most of the risk to groundwater quality is confined to the vicinity of the gold refining plants and along the river, with a probability of about 20% of exceeding the guideline value.
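Below is a minimal sketch of the hazard-quotient-as-exchange-rate idea: Monte Carlo draws of exposure and effect endpoints, with P(HQ > 1) reported for two receptors. The distributions and the TDI value are assumptions for illustration, not the Río Artiguas data.

```python
# Sketch: probabilistic hazard quotients (HQ = exposure / effect endpoint) for
# two receptors. All distributions and parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Fish: Hg body burden (mg/kg) vs. an effect-threshold distribution
fish_hg = rng.lognormal(np.log(1.0), 0.6, n)            # exposure
fish_endpoint = rng.lognormal(np.log(0.5), 0.4, n)      # e.g. NOEC-derived
hq_fish = fish_hg / fish_endpoint

# Humans: daily Hg intake from fish vs. an assumed tolerable daily intake
body_weight = 60.0                                      # kg, assumed
tdi_ug_kg_day = 0.23                                    # assumed TDI value
intake_ug_day = rng.lognormal(np.log(8.0), 0.7, n)      # from fish consumption
hq_human = (intake_ug_day / body_weight) / tdi_ug_kg_day

for label, hq in [("fish", hq_fish), ("humans", hq_human)]:
    print(f"{label}: P(HQ > 1) = {np.mean(hq > 1):.1%}")
```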

7.
We analyze the risk of contracting illness in the United States from consumption of hamburgers contaminated with enterohemorrhagic Escherichia coli (EHEC) of serogroup O157, produced from manufacturing beef imported from Australia. We used a novel approach for estimating risk, based on the prevalence and concentration estimates of E. coli O157 in lots of beef that were withdrawn from the export chain following detection of the pathogen. For the purpose of the present assessment, we assumed that no product is removed from the supply chain following testing. This, together with a number of additional conservative assumptions, leads to an overestimate of the E. coli O157-associated illness attributable to the consumption of ground beef patties manufactured only from Australian beef. We predict 49.6 illnesses (95% interval: 0.0–148.6) from the 2.46 billion hamburgers made from the 155,000 t of Australian manufacturing beef exported to the United States in 2012. All of these illnesses were attributable to undercooking in the home; fewer than one illness is predicted from consumption of hamburgers cooked to a temperature of 68 °C in quick-service restaurants.
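The sketch below shows the generic structure of such a calculation: Monte Carlo draws of in-patty concentration, a cooking log-reduction, and an exponential dose-response, scaled up to annual servings. The prevalence, concentration, log-reduction, and dose-response parameter r are illustrative assumptions, not the values estimated from the withdrawn Australian lots.

```python
# Sketch: Monte Carlo estimate of annual illnesses from hamburger servings.
# Every parameter value below is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(3)

n_servings = 2.46e9          # hamburgers per year (figure from the abstract)
prevalence = 1e-4            # assumed fraction of contaminated servings
n_sim = 1_000_000            # simulated contaminated servings

# Concentration in a contaminated raw patty (CFU), lognormal assumption
cfu_raw = rng.lognormal(mean=np.log(30.0), sigma=1.2, size=n_sim)

# Cooking log-reduction; the lower tail represents home undercooking
log_red = rng.triangular(left=2.0, mode=5.0, right=8.0, size=n_sim)
dose = cfu_raw * 10.0 ** (-log_red)

# Exponential dose-response, P(ill) = 1 - exp(-r * dose); r is assumed
r = 2e-3
p_ill = 1.0 - np.exp(-r * dose)

expected_cases = n_servings * prevalence * p_ill.mean()
print(f"Predicted illnesses/year: {expected_cases:.1f}")
```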

8.
This paper presents a new approach to estimation and inference in panel data models with a general multifactor error structure. The unobserved factors and the individual-specific errors are allowed to follow arbitrary stationary processes, and the number of unobserved factors need not be estimated. The basic idea is to filter the individual-specific regressors by means of cross-section averages such that, asymptotically as the cross-section dimension (N) tends to infinity, the differential effects of unobserved common factors are eliminated. The estimation procedure has the advantage that it can be computed by least squares applied to auxiliary regressions in which the observed regressors are augmented with cross-section averages of the dependent variable and the individual-specific regressors. A number of estimators (referred to as common correlated effects (CCE) estimators) are proposed and their asymptotic distributions are derived. The small sample properties of mean group and pooled CCE estimators are investigated by Monte Carlo experiments, which show that the CCE estimators have satisfactory small sample properties even under a substantial degree of heterogeneity and dynamics, and for relatively small values of N and T.
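A minimal sketch of the CCE mean-group recipe on simulated data follows: each unit's regression is augmented with the cross-section averages of the dependent variable and the regressor, and the unit-specific slopes are averaged. The data-generating process and dimensions are illustrative choices.

```python
# Sketch: common correlated effects (CCE) mean-group estimation on a simulated
# panel with one unobserved common factor and heterogeneous loadings.
import numpy as np

rng = np.random.default_rng(4)
N, T, beta = 100, 50, 1.0

f = rng.normal(size=T)                        # unobserved common factor
gam_y = rng.normal(1.0, 0.5, size=N)          # heterogeneous loadings in y
gam_x = rng.normal(1.0, 0.5, size=N)          # heterogeneous loadings in x

x = gam_x[:, None] * f + rng.normal(size=(N, T))
y = beta * x + gam_y[:, None] * f + rng.normal(size=(N, T))

ybar, xbar = y.mean(axis=0), x.mean(axis=0)   # cross-section averages

b = np.empty(N)
for i in range(N):
    # unit-by-unit OLS of y_i on [x_i, ybar, xbar, const]
    Z = np.column_stack([x[i], ybar, xbar, np.ones(T)])
    coef, *_ = np.linalg.lstsq(Z, y[i], rcond=None)
    b[i] = coef[0]

print(f"CCE mean-group estimate of beta: {b.mean():.3f} (true {beta})")
```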

9.
This paper establishes the higher-order equivalence of the k-step bootstrap, introduced recently by Davidson and MacKinnon (1999), and the standard bootstrap. The k-step bootstrap is a computationally very attractive alternative to the standard bootstrap for statistics based on nonlinear extremum estimators, such as generalized method of moments and maximum likelihood estimators. The paper also extends results of Hall and Horowitz (1996) to provide new results regarding the higher-order improvements of the standard bootstrap and the k-step bootstrap for extremum estimators (compared to procedures based on first-order asymptotics). The results of the paper apply to Newton-Raphson (NR), default NR, line-search NR, and Gauss-Newton k-step bootstrap procedures. They apply to the nonparametric iid bootstrap and to nonoverlapping and overlapping block bootstraps, and they cover symmetric and equal-tailed two-sided t tests and confidence intervals, one-sided t tests and confidence intervals, Wald tests and confidence regions, and J tests of overidentifying restrictions.
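The sketch below illustrates the k-step idea for a logit MLE: each bootstrap replication takes k Newton-Raphson steps from the full-sample estimate instead of iterating to convergence. The model, sample size, and choice of k are illustrative, and no higher-order refinements are computed.

```python
# Sketch: k-step bootstrap for a logit MLE. Bootstrap samples take k Newton
# steps from the full-sample estimate rather than fully re-solving.
import numpy as np

rng = np.random.default_rng(5)
n, beta_true = 500, np.array([0.5, -1.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
p = 1 / (1 + np.exp(-X @ beta_true))
y = rng.binomial(1, p)

def newton_steps(X, y, beta, k):
    """Take k Newton-Raphson steps on the logit log-likelihood."""
    for _ in range(k):
        mu = 1 / (1 + np.exp(-X @ beta))
        H = X.T @ (X * (mu * (1 - mu))[:, None])        # observed information
        beta = beta + np.linalg.solve(H, X.T @ (y - mu))
    return beta

beta_hat = newton_steps(X, y, np.zeros(2), k=25)        # "full" MLE

B, k = 999, 3
boot = np.empty((B, 2))
for b in range(B):
    idx = rng.integers(0, n, n)                         # nonparametric iid bootstrap
    boot[b] = newton_steps(X[idx], y[idx], beta_hat, k)  # k-step, warm start

print(f"k-step bootstrap standard errors: {boot.std(axis=0, ddof=1).round(3)}")
```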

10.
The public health significance of transmission of ESBL-producing Escherichia coli and Campylobacter from poultry farms to humans through flies was investigated using a worst-case risk model. Human exposure was modeled from the fraction of contaminated flies, the number of the specific bacteria per fly, the number of flies leaving the poultry farm, and the number of positive poultry houses in the Netherlands. Simplified risk calculations for transmission through consumption of chicken fillet were used for comparison, in terms of the number of human exposures, the total human exposure, and, for Campylobacter only, the number of human cases of illness. Comparing estimates of the worst-case risk of transmission through flies with estimates of the real risk of chicken fillet consumption, the number of human exposures to ESBL-producing E. coli was higher for chicken fillet than for flies, but the total level of exposure was higher for flies. For Campylobacter, risk values were almost consistently higher for transmission through flies than for chicken fillet consumption. This indicates that the public health risk of transmission of both ESBL-producing E. coli and Campylobacter to humans through flies might be important. It justifies further modeling of transmission through flies, for which additional data (on fly emigration and human exposure) are required. Similar analyses of other environmental transmission routes from poultry farms are suggested to precede further investigations into flies.

11.
Risk Analysis, 2018, 38(5): 1070–1084
Human exposure to bacteria resistant to antimicrobials, and the transfer of related genes, is a complex issue and occurs, among other pathways, via meat consumption. In a context of limited resources, the prioritization of risk management activities is essential. Since the antimicrobial resistance (AMR) situation differs substantially between countries, prioritization should be country specific. The objective of this study was to develop a systematic and transparent framework to rank combinations of bacterial species resistant to selected antimicrobial classes found in meat, based on the risk they represent for public health in Switzerland. A risk assessment model from slaughter to consumption was developed following the Codex Alimentarius guidelines for risk analysis of foodborne AMR. Using data from the Swiss AMR monitoring program, 208 combinations of animal species/bacteria/antimicrobial classes were identified as relevant hazards. Exposure assessment and hazard characterization scores were developed and combined using multicriteria decision analysis. The effect of changing the weights of the scores was explored with sensitivity analysis. Attributing equal weights to each score, poultry-associated combinations represented the highest risk. In particular, contamination with extended-spectrum β-lactamase/plasmidic AmpC-producing Escherichia coli in poultry meat ranked high for both exposure and hazard characterization. Tetracycline- or macrolide-resistant Enterococcus spp., as well as fluoroquinolone- or macrolide-resistant Campylobacter jejuni, ranked among the combinations with the highest risk. This study provides a basis for prioritizing future activities to mitigate the risk associated with foodborne AMR in Switzerland. A user-friendly version of the model was provided to risk managers; it can easily be adjusted to the constantly evolving knowledge on AMR.

12.
This paper proposes two new estimators for determining the number of factors (r) in static approximate factor models. We exploit the well-known fact that the r largest eigenvalues of the variance matrix of N response variables grow unboundedly as N increases, while the other eigenvalues remain bounded. The new estimators are obtained simply by maximizing the ratio of two adjacent eigenvalues. Our simulation results provide promising evidence for the two estimators.
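The eigenvalue-ratio rule itself is short enough to show in full; the sketch below applies it to a simulated panel with three factors. The dimensions and data-generating process are illustrative.

```python
# Sketch: eigenvalue-ratio estimation of the number of factors r. Pick the k
# (up to kmax) that maximizes the ratio of adjacent eigenvalues of the sample
# covariance matrix.
import numpy as np

rng = np.random.default_rng(6)
N, T, r_true, kmax = 100, 200, 3, 8

F = rng.normal(size=(T, r_true))                 # factors
L = rng.normal(size=(N, r_true))                 # loadings
X = F @ L.T + rng.normal(size=(T, N))            # T x N panel of responses

eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1]   # descending order
ratios = eigvals[:kmax] / eigvals[1:kmax + 1]
r_hat = int(np.argmax(ratios)) + 1

print(f"Estimated number of factors: {r_hat} (true {r_true})")
```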

13.
Matching estimators for average treatment effects are widely used in evaluation research despite the fact that their large sample properties have not been established in many cases. The absence of formal results in this area may be partly due to the fact that standard asymptotic expansions do not apply to matching estimators with a fixed number of matches because such estimators are highly nonsmooth functionals of the data. In this article we develop new methods for analyzing the large sample properties of matching estimators and establish a number of new results. We focus on matching with replacement with a fixed number of matches. First, we show that matching estimators are not N^(1/2)-consistent in general and describe conditions under which matching estimators do attain N^(1/2)-consistency. Second, we show that even in settings where matching estimators are N^(1/2)-consistent, simple matching estimators with a fixed number of matches do not attain the semiparametric efficiency bound. Third, we provide a consistent estimator for the large sample variance that does not require consistent nonparametric estimation of unknown functions. Software for implementing these methods is available in Matlab, Stata, and R.
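Below is a minimal sketch of a nearest-neighbour matching estimator of the average treatment effect with replacement and a fixed number of matches M, on simulated data. It omits the bias correction and the variance estimator discussed in the article.

```python
# Sketch: nearest-neighbour matching ATE estimator with replacement and a
# fixed number of matches M, on simulated data.
import numpy as np

rng = np.random.default_rng(7)
n, M, tau_true = 1000, 2, 1.0

X = rng.normal(size=(n, 2))                          # covariates
W = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))      # treatment indicator
Y = X.sum(axis=1) + tau_true * W + rng.normal(size=n)

def m_nearest(x0, Xpool, Ypool, M):
    """Mean outcome of the M nearest pool units to covariate vector x0."""
    d = np.linalg.norm(Xpool - x0, axis=1)
    return Ypool[np.argsort(d)[:M]].mean()

treated, control = W == 1, W == 0
imputed = np.empty(n)
for i in range(n):
    if W[i] == 1:   # impute the missing control outcome from the control pool
        imputed[i] = m_nearest(X[i], X[control], Y[control], M)
    else:           # impute the missing treated outcome from the treated pool
        imputed[i] = m_nearest(X[i], X[treated], Y[treated], M)

tau_hat = np.mean(np.where(W == 1, Y - imputed, imputed - Y))
print(f"Matching ATE estimate: {tau_hat:.3f} (true {tau_true})")
```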

14.
The current quantitative risk assessment model followed the framework proposed by the Codex Alimentarius to provide an estimate of the risk of human salmonellosis due to consumption of chicken breasts bought from Canadian retail stores and prepared in Canadian domestic kitchens. The model simulated the level of Salmonella contamination on chicken breasts throughout the retail-to-table pathway and used Canadian input parameter values, where available, to represent the risk of salmonellosis. From retail until consumption, changes in the concentration of Salmonella on each chicken breast were modeled using equations for growth and inactivation. The model predicted an average of 318 cases of salmonellosis per 100,000 consumers per year; potential reasons for this overestimation are discussed. A sensitivity analysis showed that the concentration of Salmonella on chicken breasts at retail, together with food hygiene practices in private kitchens (cross-contamination due to not washing cutting boards, utensils, and hands after handling raw meat, along with inadequate cooking), contributed most significantly to the risk of human salmonellosis. The outcome of this model emphasizes that protection from the Salmonella hazard on chicken breasts is a shared responsibility. Data needed for a comprehensive Canadian Salmonella risk assessment were identified for future research.
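A stripped-down retail-to-table chain of the kind described above is sketched below: growth during storage, a cooking log-reduction, occasional cross-contamination, and a Beta-Poisson dose-response. All inputs are illustrative assumptions rather than the Canadian model's parameters; the Beta-Poisson values are a commonly cited Salmonella parameterization.

```python
# Sketch: a minimal retail-to-table Monte Carlo chain for Salmonella on a
# chicken breast. All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(8)
n = 200_000

log_cfu_retail = rng.normal(0.0, 1.0, size=n)            # log10 CFU per breast at retail
growth = rng.uniform(0.0, 1.5, size=n)                   # log10 growth during storage abuse
cook_reduction = rng.triangular(3.0, 6.0, 9.0, size=n)   # log10 kill from cooking
cross_contam = rng.binomial(1, 0.05, size=n)             # hygiene-failure indicator

# Dose reaching the consumer: cooked dose plus a small uncooked fraction
# transferred by cross-contamination (assumed 1% transfer)
dose_cooked = 10.0 ** (log_cfu_retail + growth - cook_reduction)
dose_cross = cross_contam * 0.01 * 10.0 ** (log_cfu_retail + growth)
dose = dose_cooked + dose_cross

# Beta-Poisson dose-response (commonly cited Salmonella parameterization)
alpha, beta = 0.1324, 51.45
p_ill = 1.0 - (1.0 + dose / beta) ** (-alpha)

servings_per_person_year = 26                            # assumed
risk_per_100k = 1e5 * (1 - (1 - p_ill.mean()) ** servings_per_person_year)
print(f"Predicted cases per 100,000 consumers per year: {risk_per_100k:.0f}")
```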

15.
This article describes a simple model for quantifying the health impacts of toxic metal emissions. In contrast to most traditional models, it calculates the expectation value of the total damage (summed over the total population and over all time) for typical emission sites, rather than "worst-case" estimates for specific sites or episodes. Such a model is needed for the evaluation of many environmental policy measures, e.g., the optimal level of pollution taxes or emission limits. Based on the methodology developed by the USEPA for the assessment of multimedia pathways, the equations and parameters are assembled for As, Cd, Cr, Hg, Ni, and Pb, and some typical results are presented (the dose from seafood is not included, and for Hg the results are extremely uncertain); the model is freely available on the web. The structure of the model is very simple because, as we show, if the parameters can be approximated by time-independent constants (the case for the USEPA methodology), the total impacts can be calculated with steady-state models even though the environment is never in steady state. The collective ingestion dose is found to be roughly two orders of magnitude larger than the collective dose via inhalation. The uncertainties are large, easily an order of magnitude, and arise mainly from the parameter values of the model, in particular the transfer factors. Using linearized dose-response functions, estimates are provided for cancers due to As, Cd, Cr, and Ni, as well as IQ loss due to Pb emissions, in Europe.

16.
We develop a new specification test for IV estimators adopting a particular second-order approximation due to Bekker. The new specification test compares the forward (conventional) 2SLS estimator of the coefficient of the right-hand-side endogenous variable with the reverse 2SLS estimator of the same unknown parameter obtained when the normalization is changed. Under the null hypothesis that conventional first-order asymptotics provide a reliable guide to inference, the two estimates should be very similar. Our test checks whether the resulting difference between the two estimates is consistent with the results of second-order asymptotic theory. Essentially the same idea is applied to develop another new specification test using second-order unbiased estimators of the type first proposed by Nagar. If the forward and reverse Nagar-type estimators are not significantly different, we recommend estimation by LIML, which we demonstrate is the optimal linear combination of the Nagar-type estimators (to second order). We also demonstrate the high degree of similarity for k-class estimators between the approach of Bekker and the Edgeworth expansion approach of Rothenberg. An empirical example and Monte Carlo evidence demonstrate the operation of the new specification test.
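The sketch below computes the forward and reverse 2SLS estimates that the test compares, on simulated data. The second-order (Bekker-type) variance of their difference, which the test actually uses, is not derived here.

```python
# Sketch: forward vs. reverse 2SLS estimates of the same structural coefficient
# on simulated data with one endogenous regressor and three instruments.
import numpy as np

rng = np.random.default_rng(9)
n, beta_true = 2000, 0.5

z = rng.normal(size=(n, 3))                                        # instruments
u = rng.normal(size=n)                                             # structural error
x = z @ np.array([0.4, 0.3, 0.2]) + 0.8 * u + rng.normal(size=n)   # endogenous regressor
y = beta_true * x + u

def tsls(y, x, Z):
    """2SLS of y on a single endogenous regressor x with instruments Z."""
    Pz = Z @ np.linalg.solve(Z.T @ Z, Z.T)
    xhat = Pz @ x
    return (xhat @ y) / (xhat @ x)

beta_forward = tsls(y, x, z)          # regress y on x
beta_reverse = 1.0 / tsls(x, y, z)    # regress x on y, then invert the normalization

print(f"forward 2SLS: {beta_forward:.4f}, reverse 2SLS: {beta_reverse:.4f}")
```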

17.
This paper explores the quantitative asset-pricing implications of expectations-based reference-dependent preferences, as introduced by Koszegi and Rabin (2009, American Economic Review, 99(3), 909–936), in an otherwise traditional Lucas-tree model. I find that the model easily succeeds in matching the historical equity premium and its variability when the preference parameters are calibrated in line with micro evidence. The equity premium is high because expectations-based loss aversion makes uncertain fluctuations in consumption more painful. Additionally, loss aversion introduces variation in returns because unexpected cuts in consumption are particularly painful, and the agent wants to postpone such cuts to let his reference point decrease. This variation generates strong predictability. However, it also causes counterfactually high volatility in the risk-free rate, which I address by allowing for variation in expected consumption growth, heteroskedasticity in consumption growth, time-variant disaster risk, and sluggish belief updating.

18.
This paper considers regression models for cross-section data that exhibit cross-section dependence due to common shocks, such as macroeconomic shocks. The paper analyzes the properties of least squares (LS) estimators in this context. The results of the paper allow for any form of cross-section dependence and heterogeneity across population units. The probability limits of the LS estimators are determined, and necessary and sufficient conditions are given for consistency. The asymptotic distributions of the estimators are found to be mixed normal after recentering and scaling. The t, Wald, and F statistics are found to have asymptotic standard normal, χ², and scaled χ² distributions, respectively, under the null hypothesis when the conditions required for consistency of the parameter under test hold. However, the absolute values of the t, Wald, and F statistics are found to diverge to infinity under the null hypothesis when these conditions fail. Confidence intervals exhibit similarly dichotomous behavior. Hence, common shocks are found to be innocuous in some circumstances, but quite problematic in others. Models with factor structures for errors and regressors are considered. Using the general results, conditions are determined under which consistency of the LS estimators holds and fails in models with factor structures. The results are extended to cover heterogeneous and functional factor structures in which common factors have different impacts on different population units.

19.
Risk Analysis, 2018, 38(8): 1738–1757
We developed a risk assessment of human salmonellosis associated with consumption of alfalfa sprouts in the United States to evaluate the public health impact of applying treatments to seeds (0- to 5-log10 reduction in Salmonella) and of testing spent irrigation water (SIW) during production. The risk model considered variability and uncertainty in Salmonella contamination in seeds, Salmonella growth and spread during sprout production, sprout consumption, and Salmonella dose response. Based on an estimated prevalence of 2.35% for 6.8 kg seed batches and without interventions, the model predicted 76,600 (95% confidence interval (CI): 15,400–248,000) cases/year. The risk reduction (5- to 7-fold) predicted from a 1-log10 seed treatment alone was comparable to that from SIW testing alone, and each additional 1-log10 of seed treatment was predicted to provide a greater risk reduction than SIW testing. A 3-log10 or a 5-log10 seed treatment reduced the predicted cases/year to 139 (95% CI: 33–448) or 1.4 (95% CI: <1–4.5), respectively. Combined with SIW testing, a 3-log10 or 5-log10 seed treatment reduced the cases/year to 45 (95% CI: 10–146) or <1 (95% CI: <1–1.5), respectively. If the SIW coverage was less complete (i.e., less representative), a smaller risk reduction was predicted; e.g., a combined 3-log10 seed treatment and SIW testing with 20% coverage resulted in an estimated 92 (95% CI: 22–298) cases/year. Analysis of alternative scenarios using different assumptions for key model inputs showed that the predicted relative risk reductions are robust. This risk assessment provides a comprehensive approach for evaluating the public health impact of various interventions in a sprout production system.
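The sketch below shows, in caricature, how a k-log10 seed treatment and SIW testing enter such a model. Cases are assumed to scale linearly with surviving Salmonella, which overstates the per-log reduction relative to the published nonlinear model; the batch counts, per-batch case load, and SIW sensitivity are likewise illustrative assumptions.

```python
# Sketch: effect of a k-log10 seed treatment and SIW testing on predicted
# salmonellosis cases, by Monte Carlo over seed batches. All inputs are
# illustrative assumptions, not the published model's distributions.
import numpy as np

rng = np.random.default_rng(10)

n_batches = 10_000                # seed batches per year, assumed
prevalence = 0.0235               # contaminated 6.8 kg batches (from the abstract)
cases_per_contam_batch = 326.0    # assumed, so no-intervention total is ~76,600/yr

def predicted_cases(seed_log_reduction, siw_testing, siw_sensitivity=0.9):
    contaminated = rng.random(n_batches) < prevalence
    # Simplification: cases scale linearly with surviving Salmonella (10^-k);
    # the published model's nonlinear dose response gives a smaller
    # (5- to 7-fold) reduction per log10 of seed treatment.
    cases = contaminated * cases_per_contam_batch * 10.0 ** (-seed_log_reduction)
    if siw_testing:
        detected = contaminated & (rng.random(n_batches) < siw_sensitivity)
        cases = np.where(detected, 0.0, cases)    # detected batches are diverted
    return cases.sum()

for k in (0, 1, 3, 5):
    print(f"{k}-log10 seed treatment: {predicted_cases(k, False):9.1f} cases/yr; "
          f"with SIW testing: {predicted_cases(k, True):9.1f}")
```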

20.
The availability of high frequency financial data has generated a series of estimators based on intra-day data, improving the quality of large areas of financial econometrics. However, estimating the standard error of these estimators is often challenging. The root of the problem is that, traditionally, standard errors rely on estimating a theoretically derived asymptotic variance, and often this asymptotic variance involves substantially more complex quantities than the original parameter to be estimated. Standard errors are important: they are used to assess the precision of estimators in the form of confidence intervals, to create "feasible statistics" for testing, to build forecasting models based on, say, daily estimates, and also to optimize the tuning parameters. The contribution of this paper is to provide an alternative and general solution to this problem, which we call the Observed Asymptotic Variance. It is a general nonparametric method for assessing asymptotic variance (AVAR) and provides consistent estimators of AVAR for a broad class of integrated parameters Θ = ∫ θ_t dt, where the spot parameter process θ can be a general semimartingale, with continuous and jump components. The observed AVAR is implemented with the help of a two-scales method. Its construction works well in the presence of microstructure noise and when the observation times are irregular or asynchronous in the multivariate case. The methodology is valid for a wide variety of estimators, including the standard ones for variance and covariance, and also for more complex estimators, such as estimators of leverage effects, high-frequency betas, and semivariance.
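As background for the two-scales device mentioned above, the sketch below computes the classical two-scales realized variance for a noisy simulated price path. It illustrates the subsampling-and-averaging idea only; it is not the observed-AVAR estimator itself.

```python
# Sketch: two-scales realized variance (TSRV) for a log-price observed with
# microstructure noise, illustrating the subsample-and-average device.
import numpy as np

rng = np.random.default_rng(11)
n, sigma, noise_sd = 23_400, 0.2, 5e-4       # one trading day at 1-second ticks

dt = 1.0 / n
X = np.cumsum(sigma * np.sqrt(dt) * rng.normal(size=n))   # efficient log-price
Y = X + noise_sd * rng.normal(size=n)                     # observed with noise

def realized_var(y, step):
    """Realized variance over a subsampled grid with the given step."""
    return np.sum(np.diff(y[::step]) ** 2)

K = 300                                       # slow scale (number of subgrids)
rv_fast = realized_var(Y, 1)                  # all ticks: dominated by noise
rv_slow = np.mean([realized_var(Y[k:], K) for k in range(K)])  # averaged sparse grids
nbar = (n - K + 1) / K
tsrv = rv_slow - (nbar / n) * rv_fast         # bias-corrected two-scales estimate

print(f"true IV = {sigma**2:.4f}, naive RV = {rv_fast:.4f}, TSRV = {tsrv:.4f}")
```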
