Similar Documents
20 similar documents found (search time: 31 ms)
1.
Priority setting for food safety management at a national level requires risks to be ranked according to defined criteria. In this study, two approaches (disability‐adjusted life years (DALYs) and cost of illness (COI)) were used to generate estimates of the burden of disease for certain potentially foodborne diseases (campylobacteriosis, salmonellosis, listeriosis (invasive, perinatal, and nonperinatal), infection with Shiga toxin‐producing Escherichia coli (STEC), yersiniosis, and norovirus infection) and their sequelae in New Zealand. A modified Delphi approach was used to estimate the food‐attributable proportion for these diseases. The two approaches gave a similar ranking for the selected diseases, with campylobacteriosis and its sequelae accounting for the greatest proportion of the overall burden of disease by far.
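The DALY-based ranking described in this abstract can be sketched as follows. This is a minimal illustration of the bookkeeping only: the function names and all numbers are made-up placeholders, not the New Zealand estimates.

```python
def daly(yll, yld):
    """Disability-adjusted life years = years of life lost (YLL)
    + years lived with disability (YLD)."""
    return yll + yld

def food_attributable_burden(total_daly, food_fraction):
    """Scale total burden by the food-attributable proportion
    (in the study, elicited via a modified Delphi approach)."""
    return total_daly * food_fraction

# Illustrative ranking of two diseases by food-attributable DALYs.
diseases = {
    "campylobacteriosis": food_attributable_burden(daly(300, 400), 0.6),
    "salmonellosis": food_attributable_burden(daly(100, 150), 0.6),
}
ranked = sorted(diseases, key=diseases.get, reverse=True)
```

A COI ranking would follow the same pattern with monetary costs in place of DALYs, which is why the two approaches can be compared head to head.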

2.
Root cause analysis can be used in foodborne illness outbreak investigations to determine the underlying causes of an outbreak and to help identify actions that could be taken to prevent future outbreaks. We developed a new tool, the Quantitative Risk Assessment-Epidemic Curve Prediction Model (QRA-EC), to assist with these goals, and applied it to a 2019 Salmonella outbreak linked to melons (Centers for Disease Control and Prevention [CDC], 2019) as a case study to illustrate the utility of leveraging quantitative risk assessment for foodborne illness outbreak root cause analysis. The model was used to evaluate the impact of various root cause hypotheses (representing different contamination sources and food safety system failures in the melon supply chain) on the predicted number and timeline of illnesses. The predicted number of illnesses varied by contamination source and was strongly impacted by the prevalence and level of Salmonella contamination on the surface/inside of whole melons and inside contamination niches on equipment surfaces. The timeline of illnesses was most strongly impacted by equipment sanitation efficacy for contamination niches. Evaluating a wide range of scenarios representing various potential root causes enabled us to identify which hypotheses were likely to result in an outbreak of similar size and illness timeline to the 2019 Salmonella melon outbreak. The QRA-EC framework can be adapted to accommodate any food–pathogen pair to provide insights for foodborne outbreak investigations.

3.
Current frameworks of leadership are based on face-to-face interaction. A growing number of workers work away from their main location of work; this makes it challenging for leaders to ensure the health and safety of distributed workers. In the present study, we explore the relationship between line managers’ health and safety leadership and distributed workers’ health and safety behaviours. We also explore the organisational procedures and practices that may enhance the impact of health and safety leadership. We included a broad range of distributed workers (in analyses, minimum N = 626) from 11 organisations. We found that health-and-safety-specific leadership was positively related to distributed workers’ self-rated health, safety compliance and safety proactivity. These relationships were augmented by distributed workers’ sense of being included in the workplace. Knowledge sharing among colleagues was associated with safety compliance when health-and-safety-specific leadership was low. Our results indicate that one way of addressing the challenges of distributed working may be through line managers putting health and safety on the agenda.

4.
A number of OECD countries aim to encourage work integration of disabled persons using quota policies. For instance, Austrian firms must provide at least one job to a disabled worker per 25 nondisabled workers and are subject to a tax if they do not. This “threshold design” provides causal estimates of the noncompliance tax on disabled employment if firms do not manipulate nondisabled employment; a lower and upper bound on the causal effect can be constructed if they do. Results indicate that firms with 25 nondisabled workers employ about 0.04 (or 12%) more disabled workers than without the tax; firms do manipulate employment of nondisabled workers but the lower bound on the employment effect of the quota remains positive; employment effects are stronger in low‐wage firms than in high‐wage firms; and firms subject to the quota of two disabled workers or more hire 0.08 more disabled workers per additional quota job. Moreover, increasing the noncompliance tax increases excess disabled employment, whereas paying a bonus to overcomplying firms slightly dampens the employment effects of the tax.

5.
Thomas Oscar, Risk Analysis, 2021, 41(1): 110–130
Salmonella is a leading cause of foodborne illness (i.e., salmonellosis) outbreaks, which on occasion are attributed to ground turkey. The poultry industry uses Salmonella prevalence as an indicator of food safety. However, Salmonella prevalence is only one of several factors that determine risk of salmonellosis. Consequently, a model for predicting risk of salmonellosis from individual lots of ground turkey as a function of Salmonella prevalence and other risk factors was developed. Data for Salmonella contamination (prevalence, number, and serotype) of ground turkey were collected at meal preparation. Scenario analysis was used to evaluate effects of model variables on risk of salmonellosis. Epidemiological data were used to simulate Salmonella serotype virulence in a dose‐response model that was based on human outbreak and feeding trial data. Salmonella prevalence was 26% (n = 100) per 25 g of ground turkey, whereas Salmonella number ranged from 0 to 1.603 log with a median of 0.185 log per 25 g. Risk of salmonellosis (total arbitrary units (AU) per lot) was affected (p ≤ 0.05) by Salmonella prevalence, number, and virulence, by incidence and extent of undercooking, and by food consumption behavior and host resistance, but was not affected (p > 0.05) by serving size, serving size distribution, or total bacterial load of ground turkey when all other risk factors were held constant. When other risk factors were not held constant, Salmonella prevalence was not correlated (r = −0.39; p = 0.21) with risk of salmonellosis. Thus, Salmonella prevalence alone was not a good indicator of poultry food safety because other factors were found to alter risk of salmonellosis. In conclusion, a more holistic approach to poultry food safety, such as the process risk model developed in the present study, is needed to better protect public health from foodborne pathogens like Salmonella.
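The interaction between prevalence and number that this abstract emphasizes can be illustrated with a standard exponential dose-response model. This is a hedged stand-in, not the study's model: the paper calibrated serotype-specific virulence from outbreak and feeding-trial data, whereas the per-cell infectivity parameter `r` below is an arbitrary placeholder.

```python
import math

def p_ill_exponential(dose_cfu, r=1e-3):
    """Exponential dose-response: probability of illness after ingesting
    dose_cfu cells, with illustrative per-cell infectivity r."""
    return 1.0 - math.exp(-r * dose_cfu)

def expected_risk_per_serving(prevalence, dose_cfu, r=1e-3):
    """Risk depends on prevalence AND number: only the contaminated
    fraction of servings carries any dose at all."""
    return prevalence * p_ill_exponential(dose_cfu, r)
```

Two lots with identical prevalence but different contamination levels yield different risks under this sketch, which is the abstract's core argument against using prevalence as the sole safety indicator.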

6.
Felicia Wu, Joseph V. Rodricks, Risk Analysis, 2020, 40(Z1): 2218–2230
Before the founding of the Society for Risk Analysis (SRA) in 1980, food safety in the United States had long been a concern, but there was a lack of systematic methods to assess food-related risks. In 1906, the U.S. Congress passed, and President Roosevelt signed, the Pure Food and Drug Act and the Meat Inspection Act to regulate food safety at the federal level. These Acts followed the publication of multiple reports of food contamination, culminating in Upton Sinclair's novel The Jungle, which highlighted food and worker abuses in the meatpacking industry. Later in the 20th century, important developments in agricultural and food technology greatly increased food production. But chemical exposures from agricultural and other practices resulted in major amendments to federal food laws, including the Delaney Clause, aimed specifically at cancer-causing chemicals. When quantitative risk assessment methods were given greater scientific status in a seminal National Research Council report, food safety risk assessment became more systematized. Additionally, in these last 40 years, food safety research has resulted in increased understanding of a range of health effects from foodborne chemicals, and technological developments have improved U.S. food safety from farm to fork by offering new ways to manage risks. We discuss the history of food safety and the role risk analysis has played in its evolution, starting from over a century ago, but focusing on the last 40 years. While we focus on chemical risk assessment in the U.S., we also discuss microbial risk assessment and international food safety.

7.
The rise in obesity has largely been attributed to an increase in calorie consumption. We show that official government household survey data indicate that levels of calorie consumption have declined in England between 1980 and 2013; while there has been an increase in calories from food eaten out at restaurants, fast food, soft drinks and confectionery, overall there has been a decrease in total calories purchased. Households have shifted towards more expensive calories, both by substituting away from home production towards market production, and substituting towards higher quality foods. We show that the decline in calories can be partially, but not entirely, rationalized with weight gain by a decline in the strenuousness of work and daily life. (JEL: D12, I12, I18)

8.
To reduce consumer health risks from foodborne diseases that result from improper domestic food handling, consumers need to know how to safely handle food. To realize improvements in public health, it is necessary to develop interventions that match the needs of individual consumers. Successful intervention strategies are therefore contingent on identifying not only the practices that are important for consumer protection, but also barriers that prevent consumers from responding to these interventions. A measure of food safety behavior is needed to assess the effectiveness of different intervention strategies across different groups of consumers. A nationally representative survey was conducted in the Netherlands to determine which practices are likely conducted by which consumers. Participants reported their behaviors with respect to 55 different food-handling practices. The Rasch modeling technique was used to determine a general measure for the likelihood of an average consumer performing each food-handling behavior. Simultaneously, an average performance measure was estimated for each consumer. These two measures can be combined to predict the likelihood that an individual consumer engages in a specific food-handling behavior. A single "food safety" dimension was shown to underlie all items. Some potentially safe practices (e.g., use of meat thermometers) were reported as very difficult, while other safe practices were conducted by respondents more frequently (e.g., washing of fresh fruit and vegetables). A cluster analysis was applied to the resulting data set, and five segments of consumers were identified. Different behaviors may have different effects on microbial growth in food, and thus have different consequences for human health. Once the microbial relevance of the different consumer behaviors has been confirmed by experiments and modeling, the scale developed in the research reported here can be used to develop risk communication targeted to the needs of different consumer groups, as well as to measure the efficacy of different interventions.
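The Rasch model that combines the two measures mentioned above has a simple closed form: the probability that a consumer performs a behavior depends only on the difference between the consumer's measure and the behavior's difficulty. A minimal sketch, with illustrative parameter values:

```python
import math

def rasch_probability(consumer_measure, behavior_difficulty):
    """Rasch model: P(perform) = 1 / (1 + exp(-(theta - delta))),
    where theta is the consumer's 'food safety' measure and delta is
    the behavior's difficulty on the same logit scale."""
    x = consumer_measure - behavior_difficulty
    return 1.0 / (1.0 + math.exp(-x))

# A hard practice (e.g., thermometer use) has high difficulty; an easy one
# (e.g., washing produce) has low difficulty. Values here are made up.
p_thermometer = rasch_probability(0.5, 2.5)
p_wash_produce = rasch_probability(0.5, -1.5)
```

When measure and difficulty are equal, the probability is exactly 0.5; this is the anchoring property that lets consumers and behaviors share one scale.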

9.
We developed a quantitative risk assessment model using a discrete event framework to quantify and study the risk associated with norovirus transmission to consumers through food contaminated by infected food employees in a retail food setting. This study focused on the impact of ill food workers experiencing symptoms of diarrhea and vomiting and potential control measures for the transmission of norovirus to foods. The model examined the behavior of food employees regarding exclusion from work while ill and after symptom resolution and preventive measures limiting food contamination during preparation. The mean numbers of infected customers estimated for 21 scenarios were compared to the estimate for a baseline scenario representing current practices. Results show that prevention strategies examined could not prevent norovirus transmission to food when a symptomatic employee was present in the food establishment. Compliance with exclusion from work of symptomatic food employees is thus critical, with an estimated range of 75–226% of the baseline mean for full to no compliance, respectively. Results also suggest that efficient handwashing, handwashing frequency associated with gloving compliance, and elimination of contact between hands, faucets, and door handles in restrooms reduced the mean number of infected customers to 58%, 62%, and 75% of the baseline, respectively. This study provides quantitative data to evaluate the relative efficacy of policy and practices at retail to reduce norovirus illnesses and provides new insights into the interactions and interplay of prevention strategies and compliance in reducing transmission of foodborne norovirus.

10.
This paper explores the food safety implications of insurance products that compensate for business losses when food contamination causes a processing firm to initiate a recall. Discoveries of meat and poultry product contamination, in particular with life-threatening pathogens, are increasing. The financial losses that follow a recall can be substantial, as illustrated by several recent U.S. cases: Hudson Foods, Bil Mar, and Thorn Apple Valley Inc. Additionally, contaminated food product that escapes the current recall system poses a threat to consumer safety. The conceptual analysis presented here suggests that insurance underwriters could motivate earlier recalls and more diligent implementation of Hazard Analysis and Critical Control Point (HACCP). With sound underwriting, these changes could ultimately reduce the incidence of illness and death from foodborne pathogens.

11.
This study aims to examine whether the relationship between overtime and well-being is influenced by the voluntary vs. involuntary (i.e., compulsory) nature of overtime work and by the presence or absence of rewards for overtime. We also explored the prevalence of these types of overtime and how they were related to work and personal characteristics. A survey was conducted among a representative sample of Dutch full-time employees (N = 1612). AN(C)OVA was used to compare rewarded and unrewarded, voluntary and involuntary overtime workers on personal and work characteristics, fatigue, and work satisfaction. Most overtime workers were rewarded (62%). About half of the sample (n = 814) could be classified as either voluntary or involuntary overtime workers, or as having “mixed reasons” to work overtime. Voluntary and unrewarded overtime workers had a relatively high income and favourable job characteristics. Involuntary overtime work was associated with relatively high fatigue and low satisfaction, especially for involuntary overtime workers without rewards, who can be considered a burnout risk group. Voluntary overtime workers were non-fatigued and satisfied, even without rewards. It can be concluded that control over overtime and rewards for overtime are important for well-being. Moderate overtime work may not be a problem if it is done voluntarily. Moreover, the negative effects of compulsory overtime work may be partly offset by fair compensation for the extra work.

12.
Food safety objectives (FSOs) are established to minimize the risk of foodborne illness to consumers, but they have not yet been incorporated into regulatory policy. An FSO states the maximum frequency and/or concentration of a microbiological hazard in a food at the time of consumption that provides an acceptable level of protection to the public, and it leads to a performance criterion for industry. However, to be implemented as a regulation, this criterion has to be achievable by the affected industry. To determine an FSO, the steps to produce and store that food need to be known, especially where they have an impact on contamination, growth, and destruction. This article uses existing models for growth of Listeria monocytogenes in conjunction with calculations of FSOs to approximate the outcome of more than one introduction of the foodborne organism along the food-processing path from farm to consumer. Most models for the growth and reduction of foodborne pathogens are logarithmic in nature, which fits the growth of microorganisms, spanning many orders of magnitude. However, these logarithmic models are normally limited to a single introduction step and a single reduction step. The model presented as part of this research addresses more than one introduction of food contamination, each of which can be separated by a substantial amount of time. The advantage of treating the problem this way is the accommodation of multiple introductions of foodborne pathogens over a range of time durations and conditions.
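The log-scale bookkeeping behind an FSO can be sketched with the standard ICMSF-style inequality H0 − ΣR + ΣI ≤ FSO, where H0 is the initial contamination level and each reduction R and introduction/growth increment I is in log10 units. The extension this abstract describes is simply allowing several I terms; the numbers below are illustrative, not from the article.

```python
def meets_fso(h0, reductions, increases, fso):
    """Evaluate H0 - sum(R) + sum(I) <= FSO on the log10 CFU/g scale.
    `increases` may hold several introduction/growth steps, as in the
    multiple-introduction model described in the abstract."""
    final_level = h0 - sum(reductions) + sum(increases)
    return final_level, final_level <= fso

# Illustrative: initial load 2.0 log, one 5-log kill step, then two later
# introductions (1.0 and 0.5 log), against an FSO of -1.0 log CFU/g.
level, compliant = meets_fso(2.0, reductions=[5.0], increases=[1.0, 0.5], fso=-1.0)
```

Because everything is additive in log space, each extra introduction simply tightens the reduction the process must deliver to stay under the FSO.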

13.
Perceptions of institutions that manage hazards are important because they can affect how the public responds to hazard events. Antecedents of trust judgments have received far more attention than antecedents of attributions of responsibility for hazard events. We build upon a model of retrospective attribution of responsibility to individuals to examine these relationships regarding five classes of institutions that bear responsibility for food safety: producers (e.g., farmers), processors (e.g., packaging firms), watchdogs (e.g., government agencies), sellers (e.g., supermarkets), and preparers (e.g., restaurants). A nationally representative sample of 1,200 American adults completed an Internet‐based survey in which a hypothetical scenario involving contamination of diverse foods with Salmonella served as the stimulus event. Perceived competence and good intentions of the institution moderately decreased attributions of responsibility. A stronger factor was whether an institution was deemed (potentially) aware of the contamination and free to act to prevent or mitigate it: the more aware and free the institution was judged to be, the higher its rated responsibility. This initial model for attributions of responsibility to impersonal institutions (as opposed to individual responsibility) merits further development.

14.
Increasing evidence suggests that persistence of Listeria monocytogenes in food processing plants has been the underlying cause of a number of human listeriosis outbreaks. This study extracts criteria used by food safety experts in determining bacterial persistence in the environment, using retail delicatessen operations as a model. Using the Delphi method, we conducted an expert elicitation with 10 food safety experts from academia, industry, and government to classify L. monocytogenes persistence based on environmental sampling results collected over six months for 30 retail delicatessen stores. The results were modeled using variations of random forest, support vector machine, logistic regression, and linear regression; variable importance values of random forest and support vector machine models were consolidated to rank important variables in the experts’ classifications. The duration of subtype isolation ranked most important across all expert categories. Sampling site category also ranked high in importance and validation errors doubled when this covariate was removed. Support vector machine and random forest models successfully classified the data with average validation errors of 3.1% and 2.2% (n = 144), respectively. Our findings indicate that (i) the frequency of isolations over time and sampling site information are critical factors for experts determining subtype persistence, (ii) food safety experts from different sectors may not use the same criteria in determining persistence, and (iii) machine learning models have potential for future use in environmental surveillance and risk management programs. Future work is necessary to validate the accuracy of expert and machine classification against biological measurement of L. monocytogenes persistence.

15.
Non-work social media use at work has seen a dramatic increase in the last decade and is commonly deemed counterproductive work behaviour. However, we examined whether it may also serve as a micro-break and improve work engagement. We used ecological momentary assessment across 1 working day with up to 10 hourly measurements in 334 white-collar workers to measure non-work social media use and work engagement, resulting in 2235 hourly measurements. Multilevel modelling demonstrated that non-work social media use was associated with lower levels of work engagement between persons. Within persons, non-work social media use was also associated with lower concurrent work engagement. However, non-work social media use was related to higher levels of work engagement 1 hour later. While more extensive non-work social media use at work was generally associated with lower work engagement, our advanced study design revealed that the longer employees used social media for non-work purposes during 1 working hour, the more work engaged they were in the subsequent working hour, suggesting that employees turn to social media when energy levels are low and/or when they (temporarily) lose interest in their work. This behaviour may serve as a break, which in turn increases work engagement later during the day.

16.
A recent paper by Ferrier and Buzby provides a framework for selecting the sample size when testing a lot of beef trim for Escherichia coli O157:H7 that equates the averted costs of recalls and health damages from contaminated meats sold to consumers with the increased costs of testing while allowing for uncertainty about the underlying prevalence of contamination. Ferrier and Buzby conclude that the optimal sample size is larger than the current sample size. However, Ferrier and Buzby's optimization model has a number of errors, and their simulations failed to consider available evidence about the likelihood of the scenarios explored under the model. After correctly modeling microbial prevalence as dependent on portion size and selecting model inputs based on available evidence, the model suggests that the optimal sample size is zero under most plausible scenarios. It does not follow, however, that sampling beef trim for E. coli O157:H7, or food safety sampling more generally, should be abandoned. Sampling is not generally cost effective as a direct consumer safety control measure due to the extremely large sample sizes required to provide a high degree of confidence of detecting very low acceptable defect levels. Food safety verification sampling creates economic incentives for food producing firms to develop, implement, and maintain effective control measures that limit the probability and degree of noncompliance with regulatory limits or private contract specifications.
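The "extremely large sample sizes" point follows directly from the binomial detection probability P(detect) = 1 − (1 − p)^n. A minimal sketch, assuming independent samples and a perfectly sensitive test (neither assumption comes from the paper):

```python
import math

def detection_probability(prevalence, n):
    """Probability of at least one positive among n independent samples,
    assuming a perfectly sensitive test."""
    return 1.0 - (1.0 - prevalence) ** n

def samples_needed(prevalence, confidence):
    """Smallest n giving the stated confidence of detecting contamination
    present at the given prevalence: n = ceil(ln(1-C) / ln(1-p))."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - prevalence))

# Detecting a 1% defect rate with 95% confidence already needs ~300 samples;
# the acceptable defect levels for E. coli O157:H7 are far lower still.
n_required = samples_needed(0.01, 0.95)
```

As the acceptable defect level drops by an order of magnitude, the required sample size grows by roughly the same factor, which is why direct consumer-safety sampling is rarely cost effective.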

17.
It has been established that, to a considerable extent, the domestic hygiene practices adopted by consumers can result in a greater or lesser microbial load in prepared meals. In the research presented here, an interdisciplinary study is reported in which interviews, observations of consumers preparing a recipe, and microbial contamination of the finished meals were compared. The results suggest that, while most consumers are knowledgeable about the importance of cross-contamination and heating in preventing the occurrence of foodborne illness, this knowledge is not necessarily translated into behavior. The adoption of habitual cooking practices may also be important. Potentially risky behaviors were, indeed, observed in the domestic food preparation environment. Eighteen of the participants made errors in food preparation that could potentially result in cross-contamination, and seven participants allowed raw meat juices to come in contact with the final meal. Using a tracer microorganism, the log reduction achieved during consumer preparation was estimated at an average of 4.1 log cfu/salad. When combining these findings, it was found that cross-contamination errors were a good predictor for log reduction. Procedural food safety knowledge (i.e., knowledge proffered after general open questions) was a better predictor of efficacious bacterial reduction than declarative food safety knowledge (i.e., knowledge proffered after formal questioning). This suggests that motivation to prepare safe food was a better indicator of actual behavior than knowledge about food safety per se.
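The tracer-based figure quoted above is a log10 ratio of counts before and after preparation. A minimal sketch of that calculation; the counts below are illustrative placeholders, not the study's data:

```python
import math

def log_reduction(initial_cfu, final_cfu):
    """Log10 reduction of a tracer organism between the inoculated
    ingredient and the finished meal: log10(N0 / N)."""
    return math.log10(initial_cfu / final_cfu)

# Illustrative: a 10^5 cfu inoculum reduced to 10 cfu is a 4-log reduction,
# comparable in magnitude to the 4.1-log average reported in the study.
reduction = log_reduction(1e5, 10.0)
```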

18.
Prevention of the emergence and spread of foodborne diseases is an important prerequisite for the improvement of public health. Source attribution models link sporadic human cases of a specific illness to food sources and animal reservoirs. With next generation sequencing technology, it is possible to develop novel source attribution models. We investigated the potential of machine learning to predict, based on whole-genome sequencing, the animal reservoir from which a bacterial strain isolated from a human salmonellosis case originated. Machine learning methods recognize patterns in large and complex data sets and use this knowledge to build models. The model learns patterns associated with genetic variations in bacteria isolated from the different animal reservoirs. We selected different machine learning algorithms to predict sources of human salmonellosis cases and trained the model with Danish Salmonella Typhimurium isolates sampled from broilers (n = 34), cattle (n = 2), ducks (n = 11), layers (n = 4), and pigs (n = 159). Using cgMLST as input features, the model yielded an average accuracy of 0.783 (95% CI: 0.77–0.80) in the source prediction for the random forest and 0.933 (95% CI: 0.92–0.94) for the logit boost algorithm. The logit boost algorithm was most accurate (validation accuracy: 92%, CI: 0.8706–0.9579) and predicted the origin of 81% of the domestic sporadic human salmonellosis cases. The most important source was Danish-produced pigs (53%), followed by imported pigs (16%), imported broilers (6%), imported ducks (2%), Danish-produced layers (2%), and Danish-produced and imported cattle (<1%), while 18% were not predicted. Machine learning has potential for improving source attribution modeling based on sequence data. Results of such models can inform risk managers to identify and prioritize food safety interventions.
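The core idea, classifying an isolate's reservoir from its cgMLST allele profile, can be illustrated with a deliberately simplified stand-in. The study used random forest and logit boost classifiers; the 1-nearest-neighbour rule, locus names, and allele numbers below are illustrative assumptions only.

```python
def predict_source(isolate_profile, training_data):
    """Toy 1-nearest-neighbour source attribution: predict the reservoir of
    the training isolate sharing the most cgMLST alleles with the query.
    A simplified stand-in for the random forest / logit boost models."""
    def shared_alleles(a, b):
        return sum(1 for locus in a if a[locus] == b.get(locus))
    best_profile, best_source = max(
        training_data, key=lambda item: shared_alleles(isolate_profile, item[0])
    )
    return best_source

# Hypothetical training isolates: {locus: allele number} -> reservoir label.
training = [
    ({"locus1": 3, "locus2": 7, "locus3": 1}, "pig"),
    ({"locus1": 5, "locus2": 2, "locus3": 9}, "broiler"),
]
source = predict_source({"locus1": 3, "locus2": 7, "locus3": 2}, training)
```

Real cgMLST schemes span thousands of loci, which is exactly the high-dimensional setting where ensemble methods such as random forest and logit boost outperform simple nearest-neighbour rules.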

19.
Feminism is a theoretical perspective and social movement that seeks to reduce, and ultimately eradicate, sexist inequality and oppression. Yet feminist research remains marginal in the most prestigious management and organization studies (MOS) journals, as defined by the Financial Times 50 (FT50) list. Based on a review of how feminism is framed in these journals (1990–2018), we identify three overlapping categories of how feminism is represented: (i) as a conceptual resource which is used to address specific topics; (ii) as an empirical category associated with the study of specific types of organization or organizing practice; and, rarely, (iii) as a methodology for producing knowledge. While feminist knowledge exists beyond these parameters, such as in the journal Gender, Work & Organization, we suggest that the relative absence of explicitly feminist scholarship in the most prestigious MOS journals reflects an epistemic oppression which arises from the threat that feminism presents to established ways of knowing. Drawing on Sara Ahmed's work, we use the ‘sweaty concept’ of dangerous knowledge to show how feminism positions knowledge as personal, introducing a radical form of researcher subjectivity which relies on the acknowledgment of uncertainty. We conclude by calling for the epistemic oppression of feminist scholarship to be recognized and redressed so the potential of feminism as a way of knowing about organizations and management can be realized. This, we argue, would enable feminist research praxis in MOS to develop as an alternative location of, in bell hooks' term, healing that challenges the main/malestream.

20.
Certification is an essential feature in organic farming, and it is based on inspections to verify compliance with European Council Regulation (EC) No 834/2007. A risk‐based approach to noncompliance that alerts the control bodies when planning inspections would contribute to a more efficient and cost‐effective certification system. An analysis of factors that can affect the probability of noncompliance in organic farming has thus been developed. This article examines the application of zero‐inflated count data models to farm‐level panel data from inspection results and sanctions obtained from the Ethical and Environmental Certification Institute, one of the main control bodies in Italy. We tested many a priori hypotheses related to the risk of noncompliance. We find evidence of an important role for past noncompliant behavior in predicting future noncompliance, while farm size and the occurrence of livestock also increase the probability of noncompliance. We conclude the article by proposing that an efficient risk‐based inspection system should be designed by weighing the known probability of occurrence of a given noncompliance against the severity of its impact.
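The zero-inflated count structure used in this analysis can be sketched as a zero-inflated Poisson mass function: with some probability a farm is a "structural zero" (never noncompliant), and otherwise its noncompliance count follows a Poisson distribution. The parameter values below are illustrative, not estimates from the article.

```python
import math

def zip_pmf(k, lam, pi_zero):
    """Zero-inflated Poisson probability mass function.
    pi_zero: probability of a structural zero (an always-compliant farm);
    lam: Poisson rate of noncompliance counts for the remaining farms."""
    poisson = math.exp(-lam) * lam**k / math.factorial(k)
    if k == 0:
        return pi_zero + (1.0 - pi_zero) * poisson
    return (1.0 - pi_zero) * poisson

# Illustrative: 30% of farms are structurally compliant; the rest incur
# noncompliances at an average rate of 1 per inspection period.
p_no_noncompliance = zip_pmf(0, lam=1.0, pi_zero=0.3)
```

The excess mass at zero relative to a plain Poisson is what the zero-inflation component captures; covariates such as past noncompliance, farm size, and livestock would enter through lam and pi_zero in a fitted model.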
