Similar Articles
 20 similar articles found.
1.
T. Walton, Risk Analysis, 2012, 32(7): 1122–1138
Through the use of case-control analyses and quantitative microbial risk assessment (QMRA), the relative risks of cryptosporidiosis transmission (recreational water exposure vs. drinking water consumption) were evaluated for a Canadian community with higher-than-national rates of cryptosporidiosis. A QMRA was developed to assess the risk of Cryptosporidium infection through the consumption of municipally treated drinking water. Simulations were based on site-specific surface water contamination levels and the drinking water treatment plant's log10 reduction capacity for Cryptosporidium. Results suggested that the risk of Cryptosporidium infection via drinking water in the study community, assuming routine operation of the water treatment plant, was negligible (6 infections per 10^13 persons per day; 5th percentile: 2 infections per 10^15 persons per day; 95th percentile: 3 infections per 10^12 persons per day). The risk is essentially nonexistent during optimized, routine treatment operations. The study community achieves between 7 and 9 log10 Cryptosporidium oocyst reduction through routine water treatment processes. Although these results do not preclude the need for constant vigilance by both water treatment and public health professionals in this community, they suggest that the higher rates of cryptosporidiosis are more likely due to recreational water contact, or perhaps direct animal contact. QMRA can be successfully applied at the community level to identify data gaps, rank relative public health risks, and forecast future risk scenarios. It is most useful when performed collaboratively with local stakeholders, from beginning to end of the risk analysis paradigm.
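A minimal sketch of the QMRA chain described above: source concentration, treatment log10 reduction, and daily consumption are combined into a dose, and an exponential dose-response model converts dose into a daily infection probability inside a Monte Carlo simulation. All distributions and parameter values here are illustrative assumptions, not those fitted in the study:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 100_000  # Monte Carlo iterations

# Illustrative (assumed) inputs, not the study's fitted distributions
source_oocysts = rng.lognormal(np.log(2.0), 1.0, size=n)  # oocysts/L in source water
log_reduction = rng.uniform(7.0, 9.0, size=n)             # treatment log10 reduction (reported range)
consumption_l = rng.lognormal(np.log(1.0), 0.5, size=n)   # L of tap water per person per day

dose = source_oocysts * 10.0 ** (-log_reduction) * consumption_l  # ingested oocysts/day
r = 0.2  # exponential dose-response parameter (assumed)
p_infection = 1.0 - np.exp(-r * dose)  # daily probability of infection

print("median:", np.percentile(p_infection, 50))
print("5th/95th percentiles:", np.percentile(p_infection, [5, 95]))
```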

2.
Risk Analysis, 2018, 38(4): 638–652
The objective of this research was to analyze the impact of different cooking procedures (i.e., gas hob and traditional static oven) and levels of cooking (i.e., rare, medium, and well-done) on the inactivation of Listeria monocytogenes and Salmonella in pork loin chops. Moreover, the consumer's exposure to both microorganisms after simulation of meat leftover storage at home was assessed. The results showed that well-done cooking in a static oven was the only treatment able to inactivate the tested pathogens. The other cooking combinations always reached product temperatures ≥73.6 °C, decreasing both pathogens by between 6 log10 cfu/g and 7 log10 cfu/g. However, according to the simulation results, the few cells surviving the cooking treatments can multiply during consumer storage up to 1 log10 cfu/g, with probabilities of 0.059 (gas hob) and 0.035 (static oven) for L. monocytogenes and 0.049 (gas hob) and 0.031 (static oven) for Salmonella. The key factors affecting consumer exposure in relation to storage practices were the probability of pathogen occurrence after cooking, degree of doneness, storage time, and time spent at room temperature. The results of this study can be combined with prevalence data and dose-response models in risk assessment models and included in guidelines for consumers on practices to follow when cooking pork meat at home.
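A small sketch of the log10 bookkeeping implied above: cooking subtracts the achieved log10 reduction from the initial contamination, and whether a serving contains any viable survivor is then essentially a Poisson draw. All numbers are assumptions chosen within the ranges reported:

```python
import numpy as np

# Hypothetical illustration of the log10 arithmetic described above
n0_log = 3.0          # initial contamination, log10 cfu/g (assumed)
reduction_log = 6.5   # inactivation achieved by cooking, log10 (within the reported 6-7 range)
max_level_log = 1.0   # maximum level reached during leftover storage, log10 cfu/g (reported)

survivors_log = n0_log - reduction_log  # -3.5 log10 cfu/g, well below 1 cfu/g on average

# With so few survivors, occurrence per serving is approximately Poisson
serving_g = 150.0
expected_cells = 10.0 ** survivors_log * serving_g
p_any_survivor = 1.0 - np.exp(-expected_cells)
print(f"P(>=1 surviving cell per {serving_g:.0f} g serving) = {p_any_survivor:.3f}")
print(f"worst-case stored level: {max_level_log:.1f} log10 cfu/g")
```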

3.
We reanalyzed the Libby vermiculite miners' cohort assembled by Sullivan to estimate potency factors for lung cancer, mesothelioma, nonmalignant respiratory disease (NMRD), and all-cause mortality associated with exposure to Libby fibers. Our principal statistical tool for analyses of lung cancer, NMRD, and total mortality in the cohort was the time-dependent proportional hazards model. For mesothelioma, we used an extension of the Peto formula. For a cumulative exposure to Libby fiber of 100 f/mL-yr, our estimates of relative risk (RR) are as follows: lung cancer, RR = 1.12, 95% confidence interval (CI) = [1.06, 1.17]; NMRD, RR = 1.14, 95% CI = [1.09, 1.18]; total mortality, RR = 1.06, 95% CI = [1.04, 1.08]. These estimates were virtually identical when analyses were restricted to the subcohort of workers who were employed for at least one year. For mesothelioma, our estimate of potency is KM = 0.5 × 10^-8, 95% CI = [0.3 × 10^-8, 0.8 × 10^-8]. Finally, we estimated the mortality ratios standardized against the U.S. population for lung cancer, NMRD, and total mortality and obtained estimates in good agreement with those reported by Sullivan. The estimated potency factors form the basis for a quantitative risk assessment at Libby.
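A brief worked example of how a proportional-hazards coefficient maps to the relative risks quoted above, under the standard log-linear assumption RR = exp(beta × cumulative exposure). The exposures other than 100 f/mL-yr are illustrative:

```python
import numpy as np

rr_at_100 = 1.12                   # lung cancer RR at 100 f/mL-yr (reported)
beta = np.log(rr_at_100) / 100.0   # implied log-hazard slope per f/mL-yr

for ce in (50, 100, 400):          # cumulative exposures, f/mL-yr (illustrative)
    print(f"CE = {ce:3d} f/mL-yr -> RR = {np.exp(beta * ce):.2f}")
```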

4.
Quantitative microbial risk assessment was used to assess the risk of norovirus gastroenteritis associated with consumption of raw vegetables irrigated with highly treated municipal wastewater, using Melbourne, Australia as an example. In the absence of local norovirus concentrations, three estimation methods were developed: (1) published concentrations of norovirus in raw sewage, (2) an epidemiological method using the Melbourne prevalence of norovirus, and (3) an adjustment of method 1 to account for the prevalence of norovirus. The methods produced highly variable results, with estimates of norovirus concentrations ranging from 10^4 per milliliter to 10^7 per milliliter in raw sewage and from 1 × 10^-3 per milliliter to 3 per milliliter in treated effluent (95th percentiles). Annual disease burden was very low using method 1, from 4 to 5 log10 disability-adjusted life years (DALYs) below the 10^-6 threshold (0.005–0.1 illnesses per year). Results of method 2 were higher, with some scenarios exceeding the threshold by up to 2 log10 DALYs (up to 95,000 illnesses per year). Method 3, thought to be most representative of Melbourne conditions, predicted annual disease burdens >2 log10 DALYs lower than the threshold (~4 additional cases per year). Sensitivity analyses demonstrated that the input parameters used to estimate norovirus concentration accounted for much of the model output variability. This model, while constrained by a lack of knowledge of sewage concentrations, used the best available information and sound logic. Results suggest that current wastewater reuse practices in Melbourne are unlikely to cause norovirus risks in excess of the annual DALY health target.
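A minimal sketch of the DALY benchmark comparison used above: the predicted annual caseload is converted into a per-person burden and compared with the 10^-6 DALY threshold. The population size and the DALY loss per case are assumptions for illustration:

```python
import numpy as np

pop = 4.9e6               # Melbourne population (approximate, assumed)
illnesses_per_year = 4.0  # predicted additional cases (method 3 estimate above)
daly_per_case = 9.0e-4    # assumed DALY loss per norovirus gastroenteritis case

burden = illnesses_per_year * daly_per_case / pop  # DALYs per person per year
threshold = 1e-6
print(f"burden = {burden:.2e} DALY/person/yr; "
      f"{np.log10(threshold / burden):.1f} log10 below the 1e-6 threshold")
```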

5.
6.
In this article, the performance objectives (POs) for the Bacillus cereus group (BC) in celery, cheese, and spelt added as ingredients to a ready-to-eat mixed spelt salad, packaged under modified atmosphere, were calculated using a Bayesian approach. In order to derive the POs, BC detection and enumeration were performed in nine lots of naturally contaminated ingredients and final product. Moreover, the impact of specific production steps on BC contamination was quantified. Finally, a sampling plan to verify each ingredient lot's compliance with its PO value at a 95% confidence level (CL) was defined. To calculate the POs, detection results as well as results above the limit of detection but below the limit of quantification (i.e., censored data) were analyzed. The most probable distribution of the censored data was determined, and two-dimensional (2D) Monte Carlo simulations were performed. The PO values were calculated to meet a food safety objective (FSO) of 4 log10 cfu of BC per gram of spelt salad at the time of consumption. Assuming BC growth during storage of between 0.90 and 1.90 log10 cfu/g, the POs for BC in celery, cheese, and spelt ranged from 1.21 log10 cfu/g for celery to 2.45 log10 cfu/g for spelt. This article represents the first attempt to apply the PO concept and 2D Monte Carlo simulation to the production flow of a complex food matrix including both raw and cooked ingredients.
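A back-of-the-envelope sketch of the PO arithmetic implied above: a performance objective earlier in the chain must leave headroom for growth before consumption, so in log10 units PO ≤ FSO − expected increase. This simplified version deliberately omits the step-specific contamination changes the full model quantifies:

```python
# Simplified sketch: PO = FSO - expected growth (all in log10 cfu/g).
fso = 4.0                    # food safety objective at consumption, log10 cfu/g
growth_range = (0.90, 1.90)  # BC growth during storage, log10 (reported range)

for growth in growth_range:
    po = fso - growth
    print(f"growth {growth:.2f} log10 -> PO <= {po:.2f} log10 cfu/g")
```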

7.
To prevent and control foodborne diseases, there is a fundamental need to identify the foods that are most likely to cause illness. The goal of this study was to rank 25 commonly consumed food products associated with Salmonella enterica contamination in the Central Region of Mexico. A multicriteria decision analysis (MCDA) framework was developed to obtain an S. enterica risk score for each food product based on four criteria: probability of exposure to S. enterica through domestic food consumption (Se); S. enterica growth potential during home storage (Sg); per capita consumption (Pcc); and food attribution of S. enterica outbreaks (So). Risk scores were calculated by the equation Se·W1 + Sg·W2 + Pcc·W3 + So·W4, where each criterion was assigned a normalized value (1–5) and the relative weights (W) were defined by the opinion of 22 experts. Se had the largest effect on the risk score, being the criterion with the highest weight (35%; 95% CI 20%–60%), followed by So (24%; 5%–50%), Sg (23%; 10%–40%), and Pcc (18%; 10%–35%). The results identified chicken (4.4 ± 0.6), pork (4.2 ± 0.6), and beef (4.2 ± 0.5) as the highest-risk foods, followed by seed fruits (3.6 ± 0.5), tropical fruits (3.4 ± 0.4), and dried fruits and nuts (3.4 ± 0.5), while the food products with the lowest risk were yogurt (2.1 ± 0.3), chorizo (2.1 ± 0.4), and cream (2.0 ± 0.3). Expert-based weighting and equal weighting showed good correlation (R = 0.96) and no significant differences in ranking order within the top 20 tier. This study can help risk managers select interventions and develop targeted surveillance programs against S. enterica in high-risk food products.
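The weighted-sum score is simple enough to compute directly. In this sketch the weights are those reported above, while the per-food criterion scores are invented examples on the 1–5 scale, not the study's data:

```python
# MCDA risk score = Se*W1 + Sg*W2 + Pcc*W3 + So*W4
weights = {"Se": 0.35, "Sg": 0.23, "Pcc": 0.18, "So": 0.24}  # reported mean weights

foods = {
    # name: (Se, Sg, Pcc, So) -- illustrative normalized scores, 1-5
    "chicken": (5, 4, 4, 5),
    "yogurt":  (2, 2, 3, 1),
}

for name, (se, sg, pcc, so) in foods.items():
    score = (se * weights["Se"] + sg * weights["Sg"]
             + pcc * weights["Pcc"] + so * weights["So"])
    print(f"{name}: risk score = {score:.2f}")
```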

8.
A. de Koeijer, Risk Analysis, 2012, 32(12): 2198–2208
A predictive case-cohort model was applied to Japanese data to analyze the interaction between challenge and stability factors for bovine spongiform encephalopathy (BSE) for the period 1985–2020. BSE risk in cattle was estimated as the expected number of detectable cases per year. The model comprised a stochastic spreadsheet calculation with the following inputs: (1) the origin and quantity of live cattle and meat-and-bone meal imported into Japan, (2) the age distribution of native cattle, and (3) the estimated annual basic reproduction ratio (R0) for BSE. The estimated probability of having zero detectable cases in Japan in 2015 was 0.90 (95% CI 0.83–0.95). The corresponding value for 2020 was 0.99 (95% CI 0.98–0.99). The model predicted that detectable cases may occur in Japan beyond 2015 because of the assumption that continued transmission was permitted to occur (albeit at a very low level) after the 2001 ban on the importation and domestic use of all processed animal proteins for the production of animal feed and fertilizer. These results reinforce the need for animal health authorities to monitor the efficacy of control measures so that the future course of the BSE epidemic in Japan can be predicted with greater certainty.

9.
This study illustrates a newly developed methodology, within the U.S. EPA ecological risk assessment (ERA) framework, to predict exposure concentrations in a marine environment due to underwater releases of oil and gas. It combines the hydrodynamics of an underwater blowout, weathering algorithms, and multimedia fate and transport modeling to estimate the exposure concentration. Naphthalene and methane are used as surrogate compounds for oil and gas, respectively. Uncertainties in the multimedia input parameters are accounted for in the analysis. The 95th percentile of the exposure concentration (EC95%) is taken as the representative exposure concentration for the risk estimation. A bootstrapping method is used to characterize EC95% and its associated uncertainty. Toxicity data for the 19 species available in the literature are used to calculate the 5th percentile of the predicted no-observed-effect concentration (PNEC5%), again by bootstrapping. The risk is characterized by transforming the risk quotient (RQ), the ratio of EC95% to PNEC5%, into a cumulative risk distribution. This article describes a probabilistic basis for the ERA, which is essential from risk management and decision-making viewpoints. Two case studies, an underwater release of an oil and gas mixture and an oil release with no gaseous mixture, are used to show the systematic implementation of the methodology, the elements of ERA, and the probabilistic method in assessing and characterizing the risk.
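A compact sketch of the bootstrap construction of EC95%, PNEC5%, and the resulting RQ distribution. The exposure and toxicity samples below are synthetic stand-ins; only the percentile and bootstrap mechanics mirror the description above:

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# Synthetic stand-ins for the study's inputs
exposure = rng.lognormal(np.log(5e-3), 1.2, size=500)  # modeled naphthalene conc., mg/L
toxicity = rng.lognormal(np.log(2.0), 1.0, size=19)    # species NOECs, mg/L (19 species)

def boot_percentile(x, q, n_boot=10_000):
    """Bootstrap distribution of the q-th percentile of x."""
    idx = rng.integers(0, len(x), size=(n_boot, len(x)))
    return np.percentile(x[idx], q, axis=1)

ec95 = boot_percentile(exposure, 95)  # 95th percentile exposure concentration
pnec5 = boot_percentile(toxicity, 5)  # 5th percentile of species sensitivity
rq = ec95 / pnec5                     # risk quotient distribution
print(f"P(RQ > 1) = {np.mean(rq > 1):.3f}; median RQ = {np.median(rq):.3f}")
```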

10.
The U.S. Environmental Protection Agency's cancer guidelines (USEPA, 2005) present the default approach for the cancer slope factor (denoted here as s*) as the slope of the linear extrapolation to the origin, generally drawn from the 95% lower confidence limit on dose at the lowest prescribed risk level supported by the data. In the past, the cancer slope factor has been calculated as the upper 95% confidence limit on the coefficient (q1*) of the linear term of the multistage model for the extra cancer risk over background. To what extent do the two approaches differ in practice? We addressed this issue by calculating s* and q1* for 102 data sets for 60 carcinogens, using the constrained multistage model to fit the dose-response data. We also examined how frequently the fitted dose-response curves departed appreciably from linearity at low dose by comparing q1, the coefficient of the linear term in the multistage polynomial, with a slope factor, sc, derived from a point of departure based on the maximum likelihood estimate of the dose-response. Another question we addressed is the extent to which s* exceeded sc at various levels of extra risk. For the vast majority of chemicals, the prescribed default EPA methodology for the cancer slope factor provides values very similar to those obtained with the traditionally estimated q1*. At 10% extra risk, q1*/s* is greater than 0.3 for all except one data set; for 82% of the data sets, q1* is within 0.9 to 1.1 times s*. At the 10% response level, the interquartile range of the ratio s*/sc is 1.4 to 2.0.
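A small worked example of the two point-of-departure slopes compared above: linear extrapolation to the origin from the 95% lower confidence limit on dose (giving s*) versus from the maximum likelihood estimate of the same dose (giving sc), both at 10% extra risk. The dose values are invented for illustration:

```python
# Linear extrapolation to the origin: slope = risk level / dose at that risk.
risk_level = 0.10  # 10% extra risk (benchmark response)
led10 = 2.5        # 95% lower confidence limit on dose at 10% extra risk, mg/kg-day (assumed)
ed10 = 3.4         # maximum likelihood estimate of the same dose (assumed)

s_star = risk_level / led10  # default slope factor s*
s_c = risk_level / ed10      # MLE-based slope factor sc
print(f"s* = {s_star:.3f}, sc = {s_c:.3f}, ratio s*/sc = {s_star / s_c:.2f}")
```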

11.
Some viruses cause tumor regression and can be used to treat cancer patients; these are called oncolytic viruses. To assess whether oncolytic viruses of animal origin excreted by patients pose a health risk to livestock, a quantitative risk assessment (QRA) was performed to estimate the risk to the Dutch pig industry after environmental release of Seneca Valley virus (SVV). The QRA assumed SVV excretion in stool by one cancer patient on Day 1 in the Netherlands, discharge of SVV with treated wastewater into the river Meuse, downstream intake of river water for drinking water production, and consumption of this drinking water by pigs. Dose-response curves for SVV infection and clinical disease in pigs were constructed from experimental data. In the worst-case scenario (four log10 virus reduction by drinking water treatment and a farm with 10,000 pigs), the infection risk is less than 1% with 95% certainty. The risk of clinical disease is almost seven orders of magnitude lower. Risks may increase proportionally with the number of treated patients and the number of days of virus excretion. These data indicate that application of wild-type oncolytic animal viruses may lead to infection of susceptible livestock. A QRA regarding the use of oncolytic animal viruses is therefore highly recommended. For this, data on excretion by patients, and dose-response parameters for infection and clinical disease in livestock, should be studied.

12.
Rural communities dependent on unregulated drinking water are potentially at increased health risk from exposure to contaminants. Perception of drinking water safety influences water consumption, exposure, and health risk. A community-based participatory approach and probabilistic Bayesian methods were applied to integrate risk perception into a holistic human health risk assessment. Tap water arsenic concentrations and risk perception data were collected from two Saskatchewan communities. Drinking water health standards were exceeded in 67% (51/76) of households in Rural Municipality #184 (RM184) and 56% (25/45) in Beardy's and Okemasis First Nation (BOFN). There was no association between the presence of a health exceedance and risk perception. Households in RM184 or with an annual income >$50,000 were most likely to have in-house water treatment. The probability of consuming tap water perceived as safe (92%) or not safe (0%) suggested that households in RM184 were unlikely to drink water perceived as unsafe. The probability of drinking tap water perceived as safe (77%) or not safe (11%) suggested that households in BOFN contradicted their perception and consumed water perceived as unsafe. Integration of risk perception lowered the adult incremental lifetime cancer risk by 3% to 1.3 × 10^-5 (95% CI 8.4 × 10^-8 to 9.0 × 10^-5) for RM184 and by 8.9 × 10^-6 (95% CI 2.2 × 10^-7 to 5.9 × 10^-5) for BOFN. The probability of exposure to arsenic concentrations posing a cancer risk >1:100,000 (the negligible-risk level) was 23% for RM184 and 22% for BOFN.
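A minimal sketch of how risk perception can be folded into an incremental lifetime cancer risk (ILCR) calculation: households that perceive their tap water as unsafe and avoid it contribute no tap-water dose. Every distribution and constant below is an assumption for illustration, not a value from the study:

```python
import numpy as np

rng = np.random.default_rng(seed=3)
n = 50_000

# Hypothetical inputs for a perception-adjusted arsenic ILCR calculation
conc = rng.lognormal(np.log(0.012), 0.8, size=n)  # tap water arsenic, mg/L (assumed)
intake = rng.lognormal(np.log(1.5), 0.4, size=n)  # L/day consumed (assumed)
drinks_tap = rng.choice([1.0, 0.0], p=[0.92, 0.08], size=n)  # perception: drinks tap water?
bw = 70.0      # body weight, kg
sf = 1.5       # oral cancer slope factor for arsenic, (mg/kg-day)^-1 (assumed)
ef_frac = 350 / 365 * 30 / 70  # exposure frequency x duration over lifetime (assumed)

add = conc * intake * drinks_tap * ef_frac / bw  # average daily dose, mg/kg-day
ilcr = add * sf
print(f"mean ILCR = {np.mean(ilcr):.2e}; P(ILCR > 1e-5) = {np.mean(ilcr > 1e-5):.2f}")
```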

13.
Quantitative microbiological risk assessment was used to quantify the risk associated with exposure to Legionella pneumophila in a whirlpool. Conceptually, air bubbles ascend to the surface, intercepting Legionella from the traversed water. At the surface, each bubble bursts into predominantly noninhalable jet drops and inhalable film drops. Assuming that film drops carry half of the intercepted Legionella, totals of 4 (95% interval: 1–9) and 4.5 × 10^4 (4.4 × 10^4–4.7 × 10^4) cfu/min were estimated to be aerosolized at concentrations of 1 and 1,000 Legionella per liter, respectively. Using a dose-response model for guinea pigs to represent humans, infection risks for active whirlpool use with 100 cfu/L water for 15 minutes were 0.29 (~0.11–0.48) for susceptible males and 0.22 (~0.06–0.42) for susceptible females. A L. pneumophila concentration of ≥1,000 cfu/L water was estimated to nearly always cause an infection (mean: 0.95; 95% interval: 0.9–~1). Estimated infection risks were time-dependent, ranging from 0.02 (0–0.11) for 1-minute exposures to 0.93 (0.86–0.97) for 2-hour exposures at a L. pneumophila concentration of 100 cfu/L water. Pool water in Dutch bathing establishments should contain <100 cfu Legionella/L water. This study suggests that stricter provisions might be required to assure adequate public health protection.
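A toy reconstruction of the time dependence noted above: the inhaled dose scales linearly with exposure time, and an exponential dose-response model converts dose into infection probability. Both parameter values are assumptions, tuned only to be roughly consistent with the point estimates quoted:

```python
import numpy as np

aerosol_cfu_per_min = 0.45  # inhaled dose rate at 100 cfu/L pool water (assumed)
r = 0.06                    # exponential dose-response parameter (assumed)

for minutes in (1, 15, 120):
    dose = aerosol_cfu_per_min * minutes
    p_inf = 1.0 - np.exp(-r * dose)
    print(f"{minutes:4d} min -> dose {dose:6.1f} cfu, P(infection) = {p_inf:.2f}")
```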

14.
The objective of the present study was to integrate the relative risks from mercury exposure to stream biota, groundwater, and humans in the Río Artiguas (Sucio) river basin, Nicaragua, where local gold mining occurs. A hazard quotient was used as a common exchange rate in probabilistic estimations of exposure and effects by means of Monte Carlo simulations. The endpoints were, for stream organisms, lethal no-observed-effect concentrations (NOECs); for groundwater, the WHO guideline and inhibitory Hg concentrations in bacteria (IC); and for humans, the tolerable daily intake (TDI) and the benchmark dose level with an uncertainty factor of 10 (BMDL0.1). Macroinvertebrates and fish in the contaminated river face a higher risk from Hg exposure than humans eating contaminated fish or bacteria living in the groundwater. The river sediment is the most hazardous source for macroinvertebrates, and macroinvertebrates pose the highest risk to fish. The distribution of Hg body concentrations in fish in the mining areas of the basin may exceed the distribution of endpoint values with close to 100% probability. Similarly, the Hg concentration in cord blood of humans feeding on fish from the river was predicted to exceed the BMDL0.1 with about 10% probability. Most of the risk to groundwater quality is confined to the vicinity of the gold refining plants and along the river, with a probability of about 20% of exceeding the guideline value.
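A minimal sketch of the probabilistic hazard-quotient comparison described above: exposure and endpoint are sampled from distributions, and the fraction of Monte Carlo iterations with HQ > 1 approximates the exceedance probability. The distributions are synthetic placeholders:

```python
import numpy as np

rng = np.random.default_rng(seed=11)
n = 100_000

# Hypothetical distributions illustrating the hazard-quotient approach
exposure = rng.lognormal(np.log(0.8), 0.9, size=n)  # Hg exposure metric (e.g., ug/g)
endpoint = rng.lognormal(np.log(1.0), 0.5, size=n)  # effect endpoint (e.g., NOEC, ug/g)

hq = exposure / endpoint      # hazard quotient as the common exchange rate
p_exceed = np.mean(hq > 1.0)  # probability that exposure exceeds the endpoint
print(f"P(HQ > 1) = {p_exceed:.2f}; median HQ = {np.median(hq):.2f}")
```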

15.
Prevention of the emergence and spread of foodborne diseases is an important prerequisite for the improvement of public health. Source attribution models link sporadic human cases of a specific illness to food sources and animal reservoirs. With next-generation sequencing technology, it is possible to develop novel source attribution models. We investigated the potential of machine learning to predict, from whole-genome sequencing data, the animal reservoir from which a bacterial strain isolated from a human salmonellosis case originated. Machine learning methods recognize patterns in large and complex data sets and use this knowledge to build models. The model learns patterns associated with genetic variation in bacteria isolated from the different animal reservoirs. We selected different machine learning algorithms to predict the sources of human salmonellosis cases and trained the models with Danish Salmonella Typhimurium isolates sampled from broilers (n = 34), cattle (n = 2), ducks (n = 11), layers (n = 4), and pigs (n = 159). Using cgMLST profiles as input features, the model yielded an average accuracy of 0.783 (95% CI: 0.77–0.80) in source prediction for the random forest and 0.933 (95% CI: 0.92–0.94) for the logit boost algorithm. The logit boost algorithm was the most accurate (validation accuracy: 92%, 95% CI: 0.87–0.96) and predicted the origin of 81% of the domestic sporadic human salmonellosis cases. The most important source was Danish-produced pigs (53%), followed by imported pigs (16%), imported broilers (6%), imported ducks (2%), Danish-produced layers (2%), and Danish-produced and imported cattle (<1%), while 18% of cases could not be attributed. Machine learning has potential for improving source attribution modeling based on sequence data. The results of such models can inform risk managers in identifying and prioritizing food safety interventions.
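A runnable sketch of this kind of pipeline, assuming scikit-learn: cgMLST allele profiles (here random integers standing in for real allele calls) serve as features, and the animal reservoir is the class label. With synthetic data the accuracy is meaningless; the point is the pipeline shape:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(seed=5)

# Hypothetical stand-in for cgMLST profiles: 210 isolates x 300 loci,
# allele numbers coded as integers (real cgMLST schemes use ~3,000 loci).
# Cattle/layer counts are bumped from the study's 2/4 so 5-fold CV is valid.
n_loci = 300
counts = {"broiler": 34, "cattle": 10, "duck": 11, "layer": 10, "pig": 145}
y = np.repeat(list(counts), list(counts.values()))
X = rng.integers(1, 20, size=(len(y), n_loci))

# Random forest source-attribution model, scored by stratified cross-validation
clf = RandomForestClassifier(n_estimators=500, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")

# Attribute a new human isolate (hypothetical profile) to a reservoir
clf.fit(X, y)
print("predicted source:", clf.predict(rng.integers(1, 20, size=(1, n_loci)))[0])
```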

16.
Emergency vaccination is an effective control strategy for foot-and-mouth disease (FMD) epidemics in densely populated livestock areas, but it results in a six-month waiting period before exports can be resumed, incurring severe economic consequences for pig-exporting countries. In the European Union, a one-month waiting period based on negative test results in a final screening has been discussed. The objective of this study was to analyze the risk of exporting FMD-infected pig carcasses from a vaccinated area: (1) directly after the final screening, and (2) after a six-month waiting period. A risk model was developed to estimate the probability that a processed carcass was derived from an FMD-infected pig (Pcarc). Key variables were herd prevalence (PH), within-herd prevalence (PA), and the probability of detection at slaughter (PSL). PH and PA were estimated using Bayesian inference under the assumption that, despite all negative test results, at least one infected pig was present. Model calculations indicated that Pcarc was on average 2.0 × 10^-5 directly after the final screening and 1.7 × 10^-5 after a six-month waiting period. The additional waiting time therefore did not substantially reduce Pcarc. The estimated values are worst-case scenarios because only viraemic pigs pose a risk for disease transmission, while seropositive pigs do not. The risk of exporting FMD via pig carcasses from a vaccinated area can be further reduced by heat treatment of pork and/or by excluding high-risk pork products from export.
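A minimal sketch of the Pcarc chain implied above: sample herd prevalence (PH) and within-herd prevalence (PA) from posteriors, discount by the probability of detection at slaughter (PSL), and multiply. The Beta shapes and PSL range are invented placeholders, not the study's posteriors:

```python
import numpy as np

rng = np.random.default_rng(seed=13)
n = 100_000

# Beta posteriors loosely encoding "all tests negative, yet >=1 infected
# animal present" (shape parameters are assumptions for illustration)
ph = rng.beta(1, 2000, size=n)       # herd prevalence P_H
pa = rng.beta(1, 400, size=n)        # within-herd prevalence P_A
psl = rng.uniform(0.2, 0.6, size=n)  # P(detection at slaughter) P_SL (assumed)

p_carc = ph * pa * (1.0 - psl)       # P(processed carcass from an infected pig)
print(f"mean P_carc = {np.mean(p_carc):.2e}; 95th pct = {np.percentile(p_carc, 95):.2e}")
```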

17.
The inclusion of deep tissue lymph nodes (DTLNs), or nonvisceral lymph nodes, contaminated with Salmonella in wholesale fresh ground pork (WFGP) production may pose risks to public health. To assess the relative contribution of DTLNs to the occurrence of human salmonellosis associated with ground pork consumption, and to investigate potential critical control points in the slaughter-to-table continuum for the control of human salmonellosis in the United States, a quantitative microbial risk assessment (QMRA) model was established. The model predicted an average of 45 cases of salmonellosis (95% CI = [19, 71]) per 100,000 Americans annually due to WFGP consumption. Sensitivity analysis of all stochastic input variables showed that cooking temperature was the most influential parameter for reducing salmonellosis cases associated with WFGP meals, followed by storage temperature and the Salmonella concentration on the contaminated carcass surface before fabrication. The input variables were grouped to represent three main factors along the slaughter-to-table chain influencing Salmonella doses ingested via WFGP meals: DTLN-related factors, factors at processing other than DTLNs, and consumer-related factors. Evaluating the impact of each group of factors by second-order Monte Carlo simulation showed that DTLN-related factors had the lowest impact on the risk estimate of the three groups. These findings indicate that interventions to reduce Salmonella contamination in DTLNs or to remove DTLNs from WFGP products may be less critical for reducing human infections attributable to ground pork than improving consumers' cooking habits or decontaminating carcasses at processing.

18.
We analyze the risk of contracting illness due to consumption in the United States of hamburgers contaminated with enterohemorrhagic Escherichia coli (EHEC) of serogroup O157, produced from manufacturing beef imported from Australia. We used a novel approach for estimating risk, based on the prevalence and concentration estimates of E. coli O157 in lots of beef that were withdrawn from the export chain following detection of the pathogen. For the purposes of the present assessment, it was assumed that no product is removed from the supply chain following testing. This, together with a number of additional conservative assumptions, leads to an overestimate of E. coli O157-associated illness attributable to the consumption of ground beef patties manufactured only from Australian beef. We predict 49.6 illnesses (95% interval: 0.0–148.6) from the 2.46 billion hamburgers made from the 155,000 t of Australian manufacturing beef exported to the United States in 2012. All of these illnesses were due to undercooking in the home; less than one illness is predicted from consumption of hamburgers cooked to a temperature of 68 °C in quick-service restaurants.

19.
One-third of the annual cases of listeriosis in the United States occur during pregnancy and can lead to miscarriage or stillbirth, premature delivery, or infection of the newborn. Previous risk assessments completed by the Food and Drug Administration, the Food Safety and Inspection Service of the U.S. Department of Agriculture, and the Centers for Disease Control and Prevention (FDA/USDA/CDC)(1) and by the Food and Agriculture Organization/World Health Organization (FAO/WHO)(2) were based on dose-response data from mice. Recent animal studies using nonhuman primates(3,4) and guinea pigs(5) have both estimated LD50s of approximately 10^7 Listeria monocytogenes colony-forming units (cfu). The FAO/WHO(2) estimated a human LD50 of 1.9 × 10^6 cfu based on data from a pregnant woman consuming contaminated soft cheese. We reevaluated risk based on dose-response curves from pregnant rhesus monkeys and guinea pigs. Using standard risk assessment methodology, including hazard identification, exposure assessment, hazard characterization, and risk characterization, risk was calculated based on the new dose-response information. To compare models, we examined the mortality rate per serving at predicted doses ranging from 10^-4 to 10^12 L. monocytogenes cfu. Based on a serving of 10^6 L. monocytogenes cfu, the primate model predicts a death rate of 5.9 × 10^-1, compared to the FDA/USDA/CDC (fig. IV-12)(1) predicted rate of 1.3 × 10^-7. Based on the guinea pig and primate models, the mortality rate calculated by the FDA/USDA/CDC(1) is underestimated for this susceptible population.
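To illustrate how an LD50 anchors a dose-response curve, here is a sketch using a one-parameter exponential model calibrated to the FAO/WHO human LD50 quoted above. The model family is an assumption chosen for illustration; the cited assessments use their own fitted curves:

```python
import numpy as np

# Exponential dose-response calibrated so that P(death | LD50) = 0.5
ld50 = 1.9e6            # human LD50 from FAO/WHO outbreak data, cfu
r = np.log(2.0) / ld50  # model parameter implied by the LD50

for dose in (1e2, 1e4, 1e6, 1e8):
    p_death = 1.0 - np.exp(-r * dose)
    print(f"dose {dose:.0e} cfu -> mortality {p_death:.2e}")
```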

20.
Shiga-toxin-producing Escherichia coli (STEC) strains may cause human infections ranging from simple diarrhea to hemolytic uremic syndrome (HUS). The five main pathogenic serotypes of STEC (MPS-STEC) identified thus far in Europe are O157:H7, O26:H11, O103:H2, O111:H8, and O145:H28. Because STEC strains can survive or grow during cheese making, particularly in soft cheeses, a stochastic quantitative microbial risk assessment model was developed to assess the risk of HUS associated with the five MPS-STEC in raw milk soft cheeses. A baseline scenario represents a theoretical worst-case scenario in which no intervention is applied throughout the farm-to-fork continuum; the risk level assessed under this baseline is the risk-based level. The impact of seven preharvest scenarios (vaccines, probiotics, milk farm sorting) on the risk-based level was expressed in terms of risk reduction. The impact of the preharvest interventions ranged from 76% to 98% risk reduction, with the highest values predicted for scenarios combining a decrease in the number of cows shedding STEC with a decrease in the STEC concentration in feces. The impact of postharvest interventions on the risk-based level was also tested by applying five microbiological criteria (MCs) at the end of ripening. The five MCs differ in sample size, in the number of samples allowed to yield a value above the microbiological limit, and in the analysis methods. The predicted risk reduction varies from 25% to 96% when MCs are applied without preharvest interventions and from 1% to 96% when pre- and postharvest interventions are combined.
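A small sketch of the two-class sampling-plan logic behind such microbiological criteria: a lot passes if at most c of n sampled cheeses exceed the microbiological limit, so the acceptance probability follows a binomial sum. The (n, c) pairs and contamination rates below are illustrative, not the five MCs tested in the study:

```python
from math import comb

def p_accept(n: int, c: int, p_defective: float) -> float:
    """Binomial probability of accepting a lot with a given defect rate."""
    return sum(comb(n, k) * p_defective**k * (1 - p_defective)**(n - k)
               for k in range(c + 1))

# Compare two illustrative MCs at several within-lot contamination rates
for p in (0.01, 0.05, 0.20):
    print(f"p = {p:.2f}: n=5, c=0 -> {p_accept(5, 0, p):.3f}; "
          f"n=10, c=2 -> {p_accept(10, 2, p):.3f}")
```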
