Similar Articles
20 similar articles found (search time: 62 ms)
1.
Stakeholders making decisions in public health and world trade need improved estimates of the burden of illness of foodborne infectious diseases. In this article, we propose a Bayesian meta-analysis, or more precisely a Bayesian evidence synthesis, to assess the burden of illness of campylobacteriosis in France. Using this case study, we investigate campylobacteriosis prevalence, as well as the probabilities of the events that shape the disease pathway, by (i) applying a Bayesian approach to French and foreign human studies (from active surveillance systems, laboratory surveys, physician surveys, epidemiological surveys, and so on) along the chain of events that occurs during an episode of illness and (ii) including expert knowledge about this chain of events. We split the target population using an exhaustive and mutually exclusive partition based on health status and the level of disease investigation, and assume an approximate multinomial model over this partition. Each observed data set related to the partition thereby brings information on the parameters of the multinomial model, improving the burden-of-illness estimates that can be deduced from them; this multinomial model serves as the core model for the Bayesian evidence synthesis. Expert knowledge is introduced by way of pseudo-data. The result is a global estimate of the burden-of-illness parameters with their accompanying uncertainty.
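The multinomial core over a population partition has a convenient conjugate form. A minimal sketch of the idea, using hypothetical counts and a flat Dirichlet prior (not the authors' actual data, pseudo-data mechanism, or full evidence-synthesis model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical counts over an exhaustive, mutually exclusive partition of the
# population: [healthy; ill, not consulting; ill, consulting; ill, lab-confirmed].
counts = np.array([96_000, 2_500, 1_200, 300])

# Conjugate Dirichlet(1,...,1) prior -> Dirichlet posterior over the
# partition probabilities of the multinomial core model.
posterior = rng.dirichlet(counts + 1, size=20_000)

# A burden-of-illness parameter deducible from the core model: the overall
# probability of illness is the sum of the three "ill" cells.
p_ill = posterior[:, 1:].sum(axis=1)
lo, hi = np.percentile(p_ill, [2.5, 97.5])
print(f"P(ill): mean={p_ill.mean():.4f}, 95% CrI=({lo:.4f}, {hi:.4f})")
```

In the full synthesis, each surveillance or survey data set informs a different margin or conditional of this partition rather than the complete count vector shown here.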

2.
The disease burden of pathogens as estimated by quantitative microbial risk assessment (QMRA) and by epidemiological analysis (EA) often differs considerably. This is an unsatisfactory situation for policymakers and scientists. We explored methods to obtain a unified estimate, using campylobacteriosis in the Netherlands as an example, where previous work resulted in estimates of 4.9 million (QMRA) and 90,600 (EA) cases per year. Using the maximum likelihood approach and considering EA the gold standard, the QMRA model could reproduce the original EA estimate by adjusting mainly the dose-infection relationship. Considering QMRA the gold standard, the EA model could reproduce the original QMRA estimate by adjusting mainly the probability that a gastroenteritis case is caused by Campylobacter. A joint analysis of the QMRA and EA data and models assuming identical outcomes, using a frequentist or a Bayesian approach (with vague priors), resulted in estimates of 102,000 or 123,000 campylobacteriosis cases per year, respectively. These were close to the original EA estimate, which is likely related to the dissimilarity in data availability. The Bayesian approach further showed that relaxing the condition of equal outcomes immediately resulted in very different estimates of the number of campylobacteriosis cases per year, and that using more informative priors had little effect on the results. In conclusion, EA was dominant in estimating the burden of campylobacteriosis in the Netherlands. However, it must be noted that only statistical uncertainties were taken into account here; taking into account all uncertainties, which are usually difficult to quantify, might lead to a different conclusion.

3.
Statistical source attribution of food-related zoonoses can generally be based on reported diagnosed human cases and on surveillance results from different food sources or reservoirs of bacteria. The attribution model, or probabilistic classifier, can thus be based on (sub)typing information enabling comparison between human infections and samples derived from source surveillance. Having time series of both data sets allows temporal patterns to be analyzed, providing a repeated natural experiment. A Bayesian approach combining both sources of information over a long time series is presented for Campylobacter in Finland and Norway. The full model is transparently presented and derived from Bayes' theorem. Previous statistical source attribution approaches are advanced here (1) by explicitly modeling, over time, the cases not associated with any of the sources under surveillance, (2) by modeling the uncertain prevalence in a food source by bacteria type over time, and (3) by implementing formal model fit assessment using posterior predictive discrepancy functions. A large proportion of all campylobacteriosis cases can be attributed to broilers, but considerable uncertainty remains over time. Source attribution is inherently incomplete if only the sources under surveillance are included in the model. All statistical source attribution approaches should include a model fit assessment so that model performance can be judged with respect to the relevant quantities of interest. This is especially relevant when the model aims at a synthesis of several incomplete information sources under significant uncertainty in the explanatory variables.

4.
Jocelyne Rocourt. Risk Analysis, 2012, 32(10): 1798–1819
We used a quantitative microbiological risk assessment model to describe the risk of Campylobacter and Salmonella infection linked to chicken meals prepared in households in Dakar, Senegal. The model uses data collected specifically for this study, such as the prevalence and level of bacteria on the neck skin of chickens bought in Dakar markets, time-temperature profiles recorded from purchase to consumption, an observational survey of meal preparation in private kitchens, and detection and enumeration of pathogens on kitchenware and cooks' hands. Thorough heating kills all bacteria present on chicken during cooking, but cross-contamination of cooked chicken or ready-to-eat food prepared for the meal via kitchenware and cooks' hands leads to a high expected frequency of pathogen ingestion. Additionally, significant growth of Salmonella is predicted during food storage at ambient temperature before and after meal preparation. These high exposures lead to a high estimated risk of campylobacteriosis and/or salmonellosis in Dakar households. The public health consequences could be amplified by the high level of antimicrobial resistance of Salmonella and Campylobacter observed in this setting. A significant decrease in the number of ingested bacteria and in the risk could be achieved through a reduction of the prevalence of chicken contamination at slaughter, and by the use of simple hygienic measures in the kitchen. There is an urgent need to reinforce the hygiene education of food handlers in Senegal.

5.
Many farmers in water-scarce regions of developing countries use wastewater to irrigate vegetables and other agricultural crops, a practice that may expand with climate change. There are a number of health risks associated with wastewater irrigation of human food crops, particularly with the surface irrigation techniques common in the developing world. The World Health Organization (WHO) recommends using quantitative microbial risk assessment (QMRA) to determine whether an irrigation scheme meets health standards. However, only a few vegetables have been studied for wastewater risk, and little is known about the disease burden of wastewater-irrigated vegetable consumption in China. To bridge this knowledge gap, an experiment was conducted to determine the volume of water left on Asian vegetables and lettuce after irrigation. One hundred samples each of Chinese chard (Brassica rapa var. chinensis), Chinese broccoli (Brassica oleracea var. alboglabra), Chinese flowering cabbage (Brassica rapa var. parachinensis), and lettuce (Lactuca sativa) were harvested after overhead sprinkler irrigation. Chinese broccoli and flowering cabbage were found to capture the most water and lettuce the least. QMRAs were then constructed to estimate the rotavirus disease burden from consumption of wastewater-irrigated Asian vegetables in Beijing. Results indicate that estimated risks from these reuse scenarios exceed WHO guideline thresholds for acceptable disease burden for wastewater use, signifying that reduction of pathogen concentration or stricter risk management is necessary for safe reuse. Considering the widespread practice of wastewater irrigation for food production, particularly in developing countries, incorporating water retention factors in QMRAs can reduce uncertainty regarding health risks for consumers worldwide.
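The water-retention measurements translate directly into an ingested-dose term for the QMRA. A minimal sketch, with all parameter values hypothetical, of how retained irrigation water drives the dose per serving:

```python
def dose_per_serving(pathogen_per_L, retained_mL_per_g, serving_g):
    # Dose = pathogen concentration in irrigation water x volume of water
    # retained on the produce surface at harvest (mL of water per g of vegetable).
    return pathogen_per_L * (retained_mL_per_g / 1000.0) * serving_g

# Hypothetical scenario: 1e4 rotavirus particles/L in wastewater, a vegetable
# retaining 0.05 mL of water per gram, and a 100 g serving.
print(dose_per_serving(1e4, 0.05, 100))  # -> 50 particles per serving
```

Vegetables that capture more water per gram (here, Chinese broccoli and flowering cabbage) receive proportionally higher doses under the same irrigation water quality.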

6.
A Bayesian approach was developed by Hald et al. (1) to estimate the contribution of different food sources to the burden of human salmonellosis in Denmark. This article describes the development of several modifications that can be used to adapt the model to different countries and pathogens. Our modified Hald model has several advantages over the original approach, including the introduction of uncertainty in the estimates of source prevalence and an improved strategy for identifiability. We have applied our modified model to the two major food-borne zoonoses in New Zealand, namely campylobacteriosis and salmonellosis. The major challenges were the data quality for salmonellosis and the inclusion of environmental sources of campylobacteriosis. We conclude that by modifying the Hald model we have improved its identifiability, made it more applicable to countries with less intensive surveillance, and made it feasible for other pathogens, in particular with respect to the inclusion of nonfood sources. The wider application and better understanding of this approach is of particular importance given the value of the model for decision making and risk management.

7.
To address the persistent problems of foodborne and zoonotic disease, public health officials worldwide face difficult choices about how to best allocate limited resources and target interventions to reduce morbidity and mortality. Data-driven approaches to informing these decisions have been developed in a number of countries. Integrated comparative frameworks generally share three methodological components: estimating incidence of acute illnesses, chronic sequelae, and mortality; attributing pathogen-specific illnesses to foods; and calculating integrated measures of disease burden such as cost of illness, willingness to pay, and health-adjusted life years (HALYs). To discuss the similarities and differences in these approaches, to seek consensus on principles, and to improve international collaboration, the E.U. MED-VET-NET and the U.S.-based Food Safety Research Consortium organized an international conference convened in Berlin, Germany, on July 19-21, 2006. This article draws in part on the deliberations of the conference and discusses general principles, data needs, methodological issues and challenges, and future research needs pertinent to objective data-driven analyses and their potential use for priority setting of foodborne and zoonotic pathogens in public health policy.

8.
Climate change may impact waterborne and foodborne infectious disease, but to what extent is uncertain. Estimating climate-change-associated relative infection risks from exposure to viruses, bacteria, or parasites in water or food is critical for guiding adaptation measures. We present a computational tool for strategic decision making that describes the behavior of pathogens using location-specific input data under current and projected climate conditions. Pathogen-pathway combinations are available for exposure to norovirus, Campylobacter, Cryptosporidium, and noncholera Vibrio species via drinking water, bathing water, oysters, or chicken fillets. Infection risk outcomes generated by the tool under current climate conditions correspond with those published in the literature. The tool demonstrates that increasing temperatures lead to increasing risks for infection with Campylobacter from consuming raw/undercooked chicken fillet and for Vibrio from water exposure. Increasing frequencies of drought generally lead to an elevated infection risk of exposure to persistent pathogens such as norovirus and Cryptosporidium, but decreasing risk of exposure to rapidly inactivating pathogens, like Campylobacter. The opposite is the case with increasing annual precipitation; an upsurge of heavy rainfall events leads to more peaks in infection risks in all cases. The interdisciplinary tool presented here can be used to guide climate change adaptation strategies focused on infectious diseases.

9.
Emerging diseases (ED) can have devastating effects on agriculture. Consequently, agricultural insurance for ED can develop if basic insurability criteria are met, including the capability to estimate the severity of ED outbreaks with associated uncertainty. The U.S. farm-raised channel catfish (Ictalurus punctatus) industry was used to evaluate the feasibility of using a disease spread simulation modeling framework to estimate the potential losses from new ED for agricultural insurance purposes. Two stochastic models were used to simulate the spread of ED between and within channel catfish ponds in Mississippi (MS) under high, medium, and low disease impact scenarios. The mean (95% prediction interval (PI)) proportion of ponds infected within disease-impacted farms was 7.6% (3.8%, 22.8%), 24.5% (3.8%, 72.0%), and 45.6% (4.0%, 92.3%), and the mean (95% PI) proportion of fish mortalities in ponds affected by the disease was 9.8% (1.4%, 26.7%), 49.2% (4.7%, 60.7%), and 88.3% (85.9%, 90.5%) for the low, medium, and high impact scenarios, respectively. The farm-level mortality losses from an ED were up to 40.3% of the total farm inventory and can be used for insurance premium rate development. Disease spread modeling provides a systematic way to organize the current knowledge on the ED perils and, ultimately, use this information to help develop actuarially sound agricultural insurance policies and premiums. However, the estimates obtained will include a large amount of uncertainty driven by the stochastic nature of disease outbreaks, by the uncertainty in the frequency of future ED occurrences, and by the often sparse data available from past outbreaks.

10.
Millions of low-income people of diverse ethnicities inhabit stressful old urban industrial neighborhoods. Yet we know little about the health impacts of built-environment stressors and risk perceptions in such settings; we lack even basic health profiles. Difficult access is one reason (it took us 30 months to survey 80 households); the lack of multifaceted survey tools is another. We designed and implemented a pilot vulnerability assessment tool in Worcester, Massachusetts. We answer: (1) How can we assess vulnerability to multiple stressors? (2) What is the nature of complex vulnerability, including risk perceptions and health profiles? (3) How can findings be used by our wider community, and what lessons did we learn? (4) What implications arise for science and policy? We sought a holistic picture of neighborhood life. A reasonably representative sample of 80 respondents captured data for 254 people about: demographics, community concerns and resources, time-activity patterns, health information, risk/stress perceptions, and resources/capacities for coping. Our key findings derive partly from the survey data and partly from our experience in obtaining those data. Data strongly suggest complex vulnerability dominated by psychosocial stress. Unexpected significant gender and ethnic disease disparities emerged: notably, females have twice the disease burden of males, and white females twice the burden of females of color (p < 0.01). Self-reported depression differentiated by gender and age is illustrative. Community based participatory research (CBPR) approaches require active engagement with marginalized populations, including representatives as funded partners. Complex vulnerability necessitates holistic, participatory approaches to improve scientific understanding and societal responses.

11.
Risk Analysis, 2018, 38(8): 1672–1684
A disease burden (DB) evaluation for environmental pathogens is generally performed using disability-adjusted life years with the aim of providing a quantitative assessment of the health hazard caused by pathogens. A critical step in the preparation for this evaluation is the estimation of morbidity between exposure and disease occurrence. In this study, the method of traditional dose-response analysis was first reviewed, and then the theoretical bases of a "single-hit" model and an "infection-illness" model were combined by incorporating two critical factors: the "infective coefficient" and the "infection duration." This allowed a dose-morbidity model to be built for direct use in DB calculations. In addition, human experimental data for typical intestinal pathogens were obtained for model validation; the results indicated that the model was well fitted and could be further used for morbidity estimation. On this basis, a real case of a water reuse project was selected for model application, and the morbidity as well as the DB caused by intestinal pathogens during water reuse was evaluated. The results show that the DB attributed to enteroviruses was significant, while that for enteric bacteria was negligible. Water treatment technology should therefore be further improved to reduce the exposure risk of enteroviruses. Since road flushing was identified as the major exposure route, human contact with reclaimed water through this pathway should be limited. The methodology proposed for model construction not only compensates for missing morbidity data during risk evaluation, but is also necessary to quantify the maximum possible DB.
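The single-hit and infection-illness steps compose multiplicatively into a dose-morbidity function. A minimal sketch, where the exponential single-hit form and all parameter values are illustrative assumptions rather than the article's fitted model:

```python
import math

def p_infection(dose, r=0.005):
    # Single-hit exponential dose-response: each ingested organism
    # independently initiates infection with probability r.
    return 1.0 - math.exp(-r * dose)

def p_illness(dose, r=0.005, eta=0.6):
    # Dose-morbidity: illness requires infection first; eta is a hypothetical
    # conditional probability of illness given infection (the
    # "infection-illness" step).
    return eta * p_infection(dose, r)

for d in (1, 10, 100, 1000):
    print(d, round(p_illness(d), 4))
```

A morbidity curve of this kind can feed directly into a DALY-based disease burden calculation, with illness probability per exposure multiplied by exposure frequency and a severity weight.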

12.
Next-generation sequencing (NGS) data present an untapped potential to improve microbial risk assessment (MRA) through increased specificity and redefinition of the hazard. Most MRA models do not account for differences in survivability and virulence among strains. The potential of machine learning algorithms for predicting the risk/health burden at the population level from large and complex NGS input data was explored with Listeria monocytogenes as a case study. The Listeria data consisted of a percentage similarity matrix from genome assemblies of 38 and 207 strains of clinical and food origin, respectively. The Basic Local Alignment Search Tool (BLAST) was used to align the assemblies against a database of 136 virulence and stress resistance genes. The outcome variable was frequency of illness, the percentage of reported cases associated with each strain. These frequency data were discretized into seven ordinal outcome categories and used for supervised machine learning and model selection from five ensemble algorithms. There was no significant difference in accuracy between the models, and a support vector machine with linear kernel was chosen for further inference (accuracy of 89% [95% CI: 68%, 97%]). The virulence genes FAM002725, FAM002728, FAM002729, InlF, InlJ, Inlk, IisY, IisD, IisX, IisH, IisB, lmo2026, and FAM003296 were important predictors of higher frequency of illness. InlF was uniquely truncated in the sequence type 121 strains. The most important risk-predictor genes occurred at the highest prevalence among strains from ready-to-eat, dairy, and composite foods. We foresee that the findings and approaches described here offer the potential for rethinking current approaches in MRA.

13.
Cryptosporidium human dose-response data from seven species/isolates are used to investigate six models of varying complexity that estimate infection probability as a function of dose. Previous models attempt to explicitly account for virulence differences among C. parvum isolates, using three or six species/isolates. Four models (two new) assume species/isolate differences are insignificant, and three of these (all but the exponential) allow for variable human susceptibility. These three human-focused models (fractional Poisson, exponential with immunity, and beta-Poisson) are relatively simple yet fit the data significantly better than the more complex isolate-focused models. Among these three, the one-parameter fractional Poisson model is the simplest but assumes that all Cryptosporidium oocysts used in the studies were capable of initiating infection. The exponential with immunity model does not require such an assumption and includes the fractional Poisson as a special case. The fractional Poisson model is an upper bound of the exponential with immunity model and applies when all oocysts are capable of initiating infection. The beta-Poisson model does not allow an immune human subpopulation; thus infection probability approaches 100% as dose becomes huge. All three of these models predict significantly (>10x) greater risk at the low doses that consumers might receive if exposed through drinking water or other environmental exposure (e.g., 72% vs. 4% infection probability for a one-oocyst dose) than previously predicted. This new insight into Cryptosporidium risk suggests additional inactivation and removal via treatment may be needed to meet any specified risk target, such as a suggested 10^-4 annual risk of Cryptosporidium infection.
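The relationships among the three human-focused models can be written in a few lines. A sketch, with parameter values as hypothetical placeholders rather than the fitted values from the article:

```python
import math

def fractional_poisson(dose, p_susc=0.7):
    # Fractional Poisson: a fraction p_susc of the population is susceptible,
    # and every oocyst is assumed capable of initiating infection
    # (per-oocyst hit probability of 1).
    return p_susc * (1.0 - math.exp(-dose))

def exp_with_immunity(dose, p_susc=0.7, r=0.9):
    # Exponential with immunity: per-oocyst infectivity r <= 1. Reduces to the
    # fractional Poisson model when r = 1, which is therefore its upper bound.
    return p_susc * (1.0 - math.exp(-r * dose))

def beta_poisson(dose, alpha=0.1, beta=0.5):
    # Approximate beta-Poisson: no immune subpopulation, so the infection
    # probability approaches 1 as the dose grows large.
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

for d in (1, 10, 100):
    print(d, round(fractional_poisson(d), 3),
          round(exp_with_immunity(d), 3), round(beta_poisson(d), 3))
```

Note the structural contrast: the first two models plateau at the susceptible fraction p_susc, while the beta-Poisson model keeps rising toward 1 at extreme doses.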

14.
Risk Analysis, 2018, 38(3): 442–453
Infections among health-care personnel (HCP) occur as a result of providing care to patients with infectious diseases, but surveillance is limited to a few diseases. The objective of this study is to determine the annual number of influenza infections acquired by HCP as a result of occupational exposures to influenza patients in hospitals and emergency departments (EDs) in the United States. A risk analysis approach was taken. A compartmental model was used to estimate the influenza dose received in a single exposure, and a dose-response function applied to calculate the probability of infection. A three-step algorithm tabulated the total number of influenza infections based on: the total number of occupational exposures (tabulated in previous work), the total number of HCP with occupational exposures, and the probability of infection in an occupational exposure. Estimated influenza infections were highly dependent upon the dose-response function. Given current compliance with infection control precautions, we estimated 151,300 and 34,150 influenza infections annually with two dose-response functions (annual incidence proportions of 9.3% and 2.1%, respectively). Greater reductions in infections were achieved by full compliance with vaccination and infection control precautions than with patient isolation. The burden of occupationally-acquired influenza among HCP in hospitals and EDs in the United States is not trivial, and can be reduced through improved compliance with vaccination and preventive measures, including engineering and administrative controls.
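The three-step tabulation reduces to simple probability arithmetic. A sketch assuming independent, identical exposures per worker, with all numbers hypothetical:

```python
def annual_infections(n_exposed_hcp, exposures_per_hcp, p_infect):
    # Probability an individual escapes infection across all exposures,
    # assuming exposures are independent and identically risky.
    p_escape = (1.0 - p_infect) ** exposures_per_hcp
    return n_exposed_hcp * (1.0 - p_escape)

# Hypothetical inputs: 500,000 exposed HCP, 4 exposures each per season,
# and a 2% infection probability per exposure from the dose-response step.
print(round(annual_infections(500_000, 4, 0.02)))
```

The strong sensitivity of the published estimates to the dose-response function corresponds here to the sensitivity of the result to p_infect.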

15.
Many scientists, activists, regulators, and politicians have expressed urgent concern that using antibiotics in food animals selects for resistant strains of bacteria that harm human health and bring nearer a "postantibiotic era" of multidrug resistant "super-bugs." Proposed political solutions, such as the Preservation of Antibiotics for Medical Treatment Act (PAMTA), would ban entire classes of subtherapeutic antibiotics (STAs) now used for disease prevention and growth promotion in food animals. The proposed bans are not driven by formal quantitative risk assessment (QRA), but by a perceived need for immediate action to prevent potential catastrophe. Similar fears led to STA phase-outs in Europe a decade ago. However, QRA and empirical data indicate that continued use of STAs in the United States has not harmed human health, and bans in Europe have not helped human health. The fears motivating PAMTA contrast with QRA estimates of vanishingly small risks. As a case study, examining specific tetracycline uses and resistance patterns suggests that there is no significant human health hazard from continued use of tetracycline in food animals. Simple hypothetical calculations suggest an unobservably small risk (between 0 and 1.75E-11 excess lifetime risk of a tetracycline-resistant infection), based on the long history of tetracycline use in the United States without resistance-related treatment failures. QRAs for other STA uses in food animals also find that human health risks are vanishingly small. Whether such QRA calculations will guide risk management policy for animal antibiotics in the United States remains to be seen.

16.
The purpose of this study was to examine tuberculosis (TB) population dynamics and to assess potential infection risk in Taiwan. A well-established mathematical model of TB transmission, built on previous models, was adopted to study the potential impact of TB transmission. A probabilistic risk model was also developed to estimate region-specific risks in Taiwan of developing disease soon after recent primary infection, through exogenous reinfection, or through endogenous reactivation (latently infected TB). We showed that the proportion of endogenous reactivation (53-67%) was larger than that of exogenous reinfection (32-47%). Our simulations showed that as the epidemic reaches a steady state, the age distribution of cases shifts toward older age groups, dominated by latently infected TB cases as a result of endogenous reactivation. A comparison of age-weighted TB incidence data with our model simulation output, with 95% credible intervals, revealed that the predictions were in apparent agreement with observed data. The median overall basic reproduction number (R0) in eastern Taiwan ranged from 1.65 to 1.72, whereas northern Taiwan had the lowest R0 estimate of 1.50. We found a 25-27% probability that the total proportion of the population infected in eastern Taiwan would exceed 90%, and a 36-66% probability that the proportion attributed to latently infected TB would exceed 20%. We suggest that our Taiwan-based analysis can be extended to the context of developing countries, where TB remains a substantial cause of elderly morbidity and mortality.
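For intuition on how R0 relates to the eventual proportion infected, the classic final-size relation of a simple homogeneous epidemic model can be solved numerically. This is a textbook simplification for illustration only, not the article's full TB transmission model (which includes reactivation and age structure):

```python
import math

def final_size(r0, tol=1e-12):
    # Fixed-point iteration on the final-size relation z = 1 - exp(-R0 * z):
    # z is the fraction of the population ever infected when R0 > 1.
    z = 0.9
    for _ in range(100_000):
        z_new = 1.0 - math.exp(-r0 * z)
        if abs(z_new - z) < tol:
            return z_new
        z = z_new
    return z

# The range of median R0 estimates reported for Taiwan regions.
for r0 in (1.50, 1.65, 1.72):
    print(r0, round(final_size(r0), 3))
```

Even modest differences in R0 within this range shift the implied eventual infected fraction noticeably, which is consistent with the regional contrasts emphasized in the abstract.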

17.
Obvious spatial infection patterns are often observed in cases associated with airborne transmissible diseases. Existing quantitative infection risk assessment models analyze the observed cases by assuming a homogeneous infectious particle concentration and ignore the spatial infection pattern, which may cause errors. This study aims at developing an approach to analyze spatial infection patterns associated with infectious respiratory diseases or other airborne transmissible diseases using infection risk assessment and likelihood estimation. Mathematical likelihood, based on binomial probability, was used to formulate the retrospective component with some additional mathematical treatments. Together with an infection risk assessment model that can address spatial heterogeneity, the method can be used to analyze the spatial infection pattern and retrospectively estimate the influencing parameters causing the cases, such as the infectious source strength of the pathogen. A Varicella outbreak was selected to demonstrate the use of the new approach. The infectious source strength estimated by the Wells-Riley concept using the likelihood estimation was compared with the estimation using the existing method. It was found that the maximum likelihood estimation matches the epidemiological observation of the outbreak case much better than the estimation under the assumption of homogeneous infectious particle concentration. Influencing parameters retrospectively estimated using the new approach can be used as input parameters in quantitative infection risk assessment of the disease under other scenarios. The approach developed in this study can also serve as an epidemiological tool in outbreak investigation. Limitations and further developments are also discussed.
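The retrospective component pairs the Wells-Riley equation with a binomial likelihood over the observed cases. A minimal sketch for a single well-mixed zone (ventilation, breathing rate, and case counts are hypothetical, and the article's approach additionally handles spatial heterogeneity across zones):

```python
import math
from math import comb

def wells_riley(quanta_per_h, breathing_m3_h, duration_h, ventilation_m3_h,
                n_infectors=1):
    # Wells-Riley infection probability assuming well-mixed room air.
    mu = (n_infectors * quanta_per_h * breathing_m3_h * duration_h
          / ventilation_m3_h)
    return 1.0 - math.exp(-mu)

def log_likelihood(quanta_per_h, n_susceptible, n_cases):
    # Binomial log-likelihood of the observed attack pattern given a candidate
    # infectious source strength (quanta generation rate).
    p = wells_riley(quanta_per_h, 0.48, 8.0, 300.0)
    return (math.log(comb(n_susceptible, n_cases))
            + n_cases * math.log(p)
            + (n_susceptible - n_cases) * math.log(1.0 - p))

# Maximum-likelihood estimate of the quanta generation rate by grid search,
# for a hypothetical outbreak of 9 cases among 30 susceptible occupants.
grid = [q / 10 for q in range(1, 2000)]
q_hat = max(grid, key=lambda q: log_likelihood(q, 30, 9))
print(round(q_hat, 1), "quanta/h")
```

The estimate recovers the source strength whose predicted infection probability matches the observed attack rate; the spatially resolved version replaces the single well-mixed probability with zone-specific probabilities inside the same likelihood.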

18.
Q fever is a zoonotic disease caused by the intracellular gram-negative bacterium Coxiella burnetii (C. burnetii), which multiplies only within phagolysosomal vacuoles. Q fever may manifest as acute or chronic disease. The acute form is generally not fatal and manifests as a self-limited febrile illness; chronic Q fever is usually characterized by endocarditis. Q fever infection has been studied in many hosts, including humans and animal models, through various exposure routes. The studies considered different endpoints, including death for animal models and clinical signs for human infection. In this article, animal experimental data available in the open literature were fit to suitable dose-response models using maximum likelihood estimation. Results for severe combined immunodeficient (SCID) mice inoculated intraperitoneally (i.p.) with C. burnetii were best fit by the beta-Poisson dose-response model. Similar inoculation (i.p.) trials conducted on C57BL/6J mice were best fit by an exponential model, whereas those run on C57BL/10ScN mice were optimally represented by a beta-Poisson dose-response model.
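Maximum likelihood fitting of candidate dose-response models to animal trial counts can be sketched in a few lines. The trial data below are invented for illustration, not the C. burnetii data sets analyzed in the article:

```python
import math

# Hypothetical (dose, animals tested, animals responding) triples.
trials = [(10, 10, 1), (100, 10, 4), (1_000, 10, 8), (10_000, 10, 10)]

def neg_log_lik_exponential(k):
    # Binomial negative log-likelihood under the exponential single-hit model
    # P(dose) = 1 - exp(-k * dose); binomial coefficients are omitted
    # because they are constant in k.
    nll = 0.0
    for dose, n, y in trials:
        p = min(max(1.0 - math.exp(-k * dose), 1e-12), 1.0 - 1e-12)
        nll -= y * math.log(p) + (n - y) * math.log(1.0 - p)
    return nll

# Crude grid-search MLE over k on a log scale; a real analysis would use a
# proper optimizer and also fit the beta-Poisson model, then compare fits
# (e.g., by likelihood ratio) to pick the better model per mouse strain.
grid = [10 ** (e / 100) for e in range(-500, -100)]
k_hat = min(grid, key=neg_log_lik_exponential)
print(f"k_hat = {k_hat:.4g}")
```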

19.
Dose-response models in microbial risk assessment consider two steps in the process ultimately leading to illness: from exposure to (asymptomatic) infection, and from infection to (symptomatic) illness. Most data and theoretical approaches are available for the exposure-infection step; the infection-illness step has received less attention. Furthermore, current microbial risk assessment models do not account for acquired immunity. These limitations may lead to biased risk estimates. We consider effects of both dose dependency of the conditional probability of illness given infection, and acquired immunity to risk estimates, and demonstrate their effects in a case study on exposure to Campylobacter jejuni. To account for acquired immunity in risk estimates, an inflation factor is proposed. The inflation factor depends on the relative rates of loss of protection over exposure. The conditional probability of illness given infection is based on a previously published model, accounting for the within-host dynamics of illness. We find that at low (average) doses, the infection-illness model has the greatest impact on risk estimates, whereas at higher (average) doses and/or increased exposure frequencies, the acquired immunity model has the greatest impact. The proposed models are strongly nonlinear, and reducing exposure is not expected to lead to a proportional decrease in risk and, under certain conditions, may even lead to an increase in risk. The impact of different dose-response models on risk estimates is particularly pronounced when introducing heterogeneity in the population exposure distribution.
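The effect of acquired immunity on risk can be illustrated with a two-state (protected/susceptible) steady state. This sketch is a deliberately simplified assumption for intuition, not the article's within-host model or its published inflation factor:

```python
import math

def susceptible_fraction(exposure_rate, loss_rate):
    # Steady state of a two-state model: exposures confer protection, which
    # is lost at rate loss_rate; only this fraction remains at risk, so the
    # ratio of the two rates drives the correction to a naive risk estimate.
    return loss_rate / (loss_rate + exposure_rate)

def annual_risk(exposure_rate, p_ill_per_exposure, loss_rate=2.0):
    s = susceptible_fraction(exposure_rate, loss_rate)
    return 1.0 - math.exp(-exposure_rate * p_ill_per_exposure * s)

# Halving the exposure rate reduces risk by much less than half, because the
# susceptible fraction rises as exposures (and immune boosting) become rarer.
print(round(annual_risk(10, 0.1), 4), round(annual_risk(5, 0.1), 4))
```

This captures the abstract's key nonlinearity: with immunity in the model, reducing exposure does not reduce risk proportionally.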

20.
Enteric viruses are often detected in water used for crop irrigation. One concern is foodborne viral disease via the consumption of fresh produce irrigated with virus-contaminated water. Although the food industry routinely uses chemical sanitizers to disinfect postharvest fresh produce, it remains unknown how sanitizer and fresh produce properties affect the risk of viral illness through fresh produce consumption. A quantitative microbial risk assessment model was conducted to estimate (i) the health risks associated with consumption of rotavirus (RV)-contaminated fresh produce with different surface properties (endive and kale) and (ii) how risks changed when using peracetic acid (PAA) or a surfactant-based sanitizer. The modeling results showed that the annual disease burden depended on the combination of sanitizer and vegetable type when vegetables were irrigated with RV-contaminated water. Global sensitivity analyses revealed that the most influential factors in the disease burden were the RV concentration in irrigation water and the postharvest disinfection efficacy. A postharvest disinfection efficacy of higher than 99% (2-log10) was needed to decrease the disease burden below the World Health Organization (WHO) threshold, even in scenarios with low RV concentrations in irrigation water (i.e., river water). All scenarios tested here with at least 99.9% (3-log10) disinfection efficacy had a disease burden lower than the WHO threshold, except for the endive treated with PAA. The disinfection efficacy for the endive treated with PAA was only about 80%, leading to a disease burden 100 times higher than the WHO threshold. These findings should be considered and incorporated into future models for estimating foodborne viral illness risks.
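The relation between disinfection efficacy, log10 reduction, and residual burden is simple arithmetic. A sketch, where the linear scaling of burden with surviving dose is a simplification of the full QMRA:

```python
import math

def log10_reduction(efficacy):
    # 99% efficacy is a 2-log10 reduction, 99.9% is 3-log10, and so on.
    return -math.log10(1.0 - efficacy)

def residual_burden(burden_untreated, efficacy):
    # Assume, for illustration, that the disease burden scales linearly with
    # the viral dose surviving postharvest disinfection.
    return burden_untreated * (1.0 - efficacy)

for eff in (0.80, 0.99, 0.999):
    print(f"{eff:.1%} efficacy -> {log10_reduction(eff):.1f}-log10, "
          f"residual burden x{1 - eff:g}")
```

Under this scaling, an 80% efficacy treatment leaves roughly 200 times the residual dose of a 99.9% treatment, which is the arithmetic behind the large burden gap between the PAA-treated endive and the 3-log10 scenarios.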


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)