Similar Documents
20 similar documents retrieved.
1.
Quantitative microbial risk assessment (QMRA) is widely accepted for characterizing the microbial risks associated with food, water, and wastewater. Single-hit dose-response models are the most commonly used dose-response models in QMRA. Denoting by P_I(d) the probability of infection at a given mean dose d, a three-parameter generalized QMRA beta-Poisson dose-response model is proposed in which the minimum number of organisms required to cause infection, Kmin, is not fixed but is a random variable following a geometric distribution. The single-hit beta-Poisson model is the special case of the generalized model with Kmin = 1. The generalized beta-Poisson model is based on a conceptual model that describes the dose-response mechanism in greater detail. Since a maximum likelihood solution is not easily available, a likelihood-free approximate Bayesian computation (ABC) algorithm is employed for parameter estimation. By fitting the generalized model to four experimental data sets from the literature, this study reveals that the posterior median estimates fall short of the value of 1 required for the single-hit assumption. However, for three of the four data sets the generalized model did not achieve an improvement in goodness of fit. Together, these results imply that, at least in some cases, a single-hit assumption for characterizing the dose-response process may not be appropriate, but that more complex models may be difficult to support, especially when the sample size is small. The three-parameter generalized model makes it possible to investigate the mechanism of a dose-response process in greater detail than is possible under a single-hit model.
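A minimal numerical sketch, not taken from the cited study: it fits the standard single-hit beta-Poisson model P_I(d) = 1 − (1 + d/β)^(−α) to a small synthetic dose-response data set with a likelihood-free ABC rejection step. The dose levels, synthetic counts, priors, and tolerance are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def beta_poisson(dose, alpha, beta):
    """Single-hit beta-Poisson probability of infection at mean dose `dose`."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Illustrative dose-response experiment: dose, subjects, infected (synthetic).
dose = np.array([1e1, 1e2, 1e3, 1e4, 1e5])
n_subj = np.array([20, 20, 20, 20, 20])
infected = np.array([1, 3, 8, 14, 19])

def simulate(alpha, beta):
    """Draw infection counts for each dose group under the model."""
    return rng.binomial(n_subj, beta_poisson(dose, alpha, beta))

def distance(sim):
    """Summary-statistic distance between simulated and observed counts."""
    return np.sum((sim - infected) ** 2)

# ABC rejection sampling: keep prior draws whose simulated data lie close
# to the observations (tolerance chosen loosely for illustration).
accepted = []
for _ in range(20000):
    a = rng.uniform(0.01, 2.0)      # prior for alpha (assumed)
    b = 10 ** rng.uniform(0, 5)     # prior for beta (assumed, log-uniform)
    if distance(simulate(a, b)) <= 20:
        accepted.append((a, b))

post = np.array(accepted)
print("accepted draws:", len(post))
print("posterior median alpha, beta:", np.median(post, axis=0))
```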

2.
The error estimate of Borgonovo's moment-independent importance index is considered, and it shows that the computational difficulty of the index is mainly due to the probability density function (PDF) estimate, because PDF estimation is an ill-posed problem with a quite slow convergence rate. This motivates computing Borgonovo's index by other means. To avoid the PDF estimate, the index, which is defined through the PDF, is first approximately represented in terms of the cumulative distribution function (CDF). The CDF estimate is well posed and its convergence rate is always faster than that of the PDF estimate. From this representation, a stable approach with an adaptive procedure is proposed to compute the index. Since a small-probability multidimensional integral must be computed in this procedure, a computational strategy named asymptotic space integration is introduced to reduce the high-dimensional integral to a one-dimensional integral. The small-probability multidimensional integral can then be computed by adaptive numerical integration in one dimension with an improved convergence rate. A comparison of the numerical error on several examples shows that the proposed method is an effective approach to computing this uncertainty importance measure.
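For orientation, a brute-force Monte Carlo baseline (not the paper's CDF-based method) that estimates Borgonovo's moment-independent index δ_i = ½ E_{X_i}[∫ |f_Y(y) − f_{Y|X_i}(y)| dy] using kernel density estimates; the toy model Y = X₁ + 2X₂ with standard normal inputs and all sample sizes are assumptions, and the KDE step is exactly the costly PDF estimation the abstract argues against.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

def model(x1, x2):
    """Toy model with independent standard normal inputs (assumed)."""
    return x1 + 2.0 * x2

N = 5000          # samples for the unconditional output density
N_outer = 50      # conditioning values of X_i
N_inner = 2000    # samples for each conditional density

# Unconditional output density f_Y via KDE, evaluated on a fixed grid.
y_all = model(rng.standard_normal(N), rng.standard_normal(N))
grid = np.linspace(y_all.min() - 3, y_all.max() + 3, 400)
f_y_grid = gaussian_kde(y_all)(grid)

def delta_index(which):
    """Monte Carlo estimate of Borgonovo's delta for input 1 or 2."""
    shifts = []
    for xi in rng.standard_normal(N_outer):        # outer loop over X_i
        other = rng.standard_normal(N_inner)
        y_cond = model(xi, other) if which == 1 else model(other, xi)
        f_cond = gaussian_kde(y_cond)(grid)
        # L1 distance between conditional and unconditional densities.
        shifts.append(np.trapz(np.abs(f_y_grid - f_cond), grid))
    return 0.5 * np.mean(shifts)

print("delta_1 ~", round(delta_index(1), 3))
print("delta_2 ~", round(delta_index(2), 3))   # expected larger than delta_1
```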

3.
Microbiological food safety is an important economic and health issue in the context of globalization and presents food business operators with new challenges in providing safe foods. The hazard analysis and critical control point approach involves identifying the main steps in food processing and the physical and chemical parameters that have an impact on the safety of foods. In the risk-based approach, as defined in the Codex Alimentarius, controlling these parameters so that the final products meet a food safety objective (FSO) fixed by the competent authorities is a major challenge and of great interest to food business operators. Process risk models, derived from the quantitative microbiological risk assessment framework, provide useful tools in this respect. We propose a methodology, called multivariate factor mapping (MFM), for establishing a link between process parameters and compliance with a FSO. For a stochastic and dynamic process risk model of a microbial hazard in soft cheese made from pasteurized milk with many uncertain inputs, multivariate sensitivity analysis and MFM are combined to (i) identify the critical control points (CCPs) throughout the food chain and (ii) compute the critical limits of the most influential process parameters, located at the CCPs, with regard to the specific process implemented in the model. Because of certain interactions among parameters, the results point to new possibilities for the management of microbiological hazards when a FSO is specified.
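A minimal Monte Carlo factor-mapping sketch under assumed names and a toy one-step growth model, none of which is the cited cheese model: parameter sets are sampled, runs are split by whether the final contamination meets a hypothetical FSO, and the compliant subset is used to read off an indicative critical limit for the most influential parameter.

```python
import numpy as np

rng = np.random.default_rng(2)
FSO = 2.0  # hypothetical food safety objective: final log10 CFU/g must stay below this

n = 20000
# Hypothetical uncertain process parameters (all distributions assumed).
initial_log = rng.normal(-1.0, 0.5, n)        # initial contamination, log10 CFU/g
storage_temp = rng.uniform(2.0, 12.0, n)      # storage temperature, deg C
storage_days = rng.uniform(5.0, 30.0, n)      # storage duration, days

# Toy growth model: growth rate increases with temperature above 4 C (assumed).
growth_rate = 0.02 * np.maximum(storage_temp - 4.0, 0.0)   # log10/day
final_log = initial_log + growth_rate * storage_days

compliant = final_log < FSO
print(f"fraction compliant with FSO: {compliant.mean():.2%}")

# Factor mapping: compare parameter distributions in the compliant vs
# non-compliant subsets; a large separation flags a candidate CCP.
for name, x in [("initial_log", initial_log),
                ("storage_temp", storage_temp),
                ("storage_days", storage_days)]:
    sep = abs(x[compliant].mean() - x[~compliant].mean()) / x.std()
    print(f"{name:13s} standardized separation: {sep:.2f}")

# Indicative critical limit for storage temperature: the 95th percentile
# of the compliant subset, i.e. a limit respected by most compliant runs.
print("indicative critical limit, storage_temp:",
      round(np.percentile(storage_temp[compliant], 95), 1), "deg C")
```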

4.
Few studies have focused on the different roles risk factors play in the multistate temporal natural course of breast cancer. We proposed a three-state Markov regression model to predict the risk of transition from free of breast cancer (FBC) to the preclinical screen-detectable phase (PCDP) and from the PCDP to the clinical phase (CP). We searched for the initiators and promoters affecting the onset and subsequent progression of breast tumors to build a three-state temporal natural history model with state-dependent genetic and environmental covariates. This risk assessment model was applied to a cohort of one million Taiwanese women, and the proposed model was verified by external validation with another independent data set. We identified three kinds of initiators, including the BRCA gene, seven single-nucleotide polymorphisms, and breast density. ER, Ki-67, and HER-2 were found to be promoters. Body mass index and age at first pregnancy both played a role. Among women carrying the BRCA gene, the 10-year predicted risk for the transition from FBC to CP was 25.83%, 20.31%, and 13.84% for the high-, intermediate-, and low-risk groups, respectively. The corresponding figures were 1.55%, 1.22%, and 0.76% among noncarriers. The mean sojourn time in the PCDP ranged from 0.82 years for the highest risk group to 6.21 years for the lowest. The lack of statistical significance in the external validation indicated the adequacy of the proposed model. The three-state model with state-dependent covariates of initiators and promoters was proposed for achieving individually tailored screening and for personalized clinical surveillance of early breast cancer.
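A stripped-down illustration of a progressive three-state Markov model (FBC → PCDP → CP) with constant transition intensities scaled by hypothetical multiplicative covariate effects; the intensities, relative risks, and 10-year horizon are assumptions, not the study's estimates.

```python
import numpy as np
from scipy.linalg import expm

# Baseline transition intensities per year (assumed values for illustration).
lam_onset = 0.002        # FBC -> PCDP
lam_progression = 0.30   # PCDP -> CP (mean sojourn in PCDP = 1/0.30 ~ 3.3 years)

def ten_year_risk(rel_risk_onset=1.0, rel_risk_prog=1.0, years=10.0):
    """P(reaching CP within `years`) starting from FBC, with multiplicative
    covariate effects (e.g. initiators on onset, promoters on progression)."""
    q01 = lam_onset * rel_risk_onset
    q12 = lam_progression * rel_risk_prog
    Q = np.array([[-q01,  q01,  0.0],
                  [ 0.0, -q12,  q12],
                  [ 0.0,  0.0,  0.0]])      # CP is absorbing
    P = expm(Q * years)                     # transition probability matrix
    return P[0, 2]

print("10-year FBC->CP risk, baseline profile:  ", round(ten_year_risk(), 4))
print("10-year FBC->CP risk, high-risk profile: ",
      round(ten_year_risk(rel_risk_onset=15, rel_risk_prog=1.5), 4))
print("mean sojourn time in PCDP (years):       ",
      round(1.0 / lam_progression, 2))
```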

5.
Exposure assessment for food and drink consumption requires combining information about people's consumption of products with concentration data sets to predict chemical intake by humans. In this article, we present a method called nonparametric predictive inference (NPI) for exposure assessment. NPI is a distribution-free method relying only on Hill's assumption A(n). Effectively, A(n) is a post-data exchangeability assumption, which is a natural starting point for nonparametric statistics; for further discussion we refer to works by Hill and Coolen. We illustrate how NPI can be implemented to produce predictions for an individual's exposure based on consumption, body weight, and concentration data. NPI has the advantage that no distribution needs to be assumed in order to implement it. There may, however, be information available to suggest a distribution for a random quantity. Therefore, we present an NPI-Bayes hybrid method in which this information can be taken into account by using Bayesian methods while using NPI for the other random quantities in the model.
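A small sketch of the NPI-style predictive step under Hill's A(n): a future observation is taken to fall in each of the n+1 intervals formed by the ordered data with equal probability 1/(n+1). The exposure values, the uniform draw within each interval, and the handling of the two end intervals are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Observed daily exposures (concentration * consumption / body weight),
# in hypothetical units -- purely illustrative data.
exposure = np.sort(np.array([0.4, 0.7, 0.9, 1.3, 1.8, 2.6, 3.1, 4.9]))
n = len(exposure)

def npi_predictive_sample(size, lower=0.0, upper=10.0):
    """Draw predictive exposures: pick one of the n+1 intervals between the
    ordered observations with probability 1/(n+1) each (Hill's A(n)), then
    draw uniformly within it. Finite end intervals [lower, x_(1)] and
    [x_(n), upper] are an assumption made for this sketch."""
    edges = np.concatenate(([lower], exposure, [upper]))
    idx = rng.integers(0, n + 1, size=size)          # which interval
    u = rng.uniform(size=size)
    return edges[idx] + u * (edges[idx + 1] - edges[idx])

pred = npi_predictive_sample(100000)
print("predictive P(exposure > 3.0) ~", round(np.mean(pred > 3.0), 3))
print("predictive median exposure   ~", round(np.median(pred), 2))
```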

6.
Hormesis refers to a nonmonotonic (biphasic) dose–response relationship in toxicology, environmental science, and related fields. In the presence of hormesis, a low dose of a toxic agent may carry a lower risk than the control dose, and the risk may increase at high doses. When the sample size is small due to practical, logistic, and ethical considerations, a parametric model may provide an efficient approach to hypothesis testing, at the cost of adopting a strong assumption that is not guaranteed to be true. In this article, we first consider alternative parameterizations based on the traditional three-parameter logistic regression. The new parameterizations attempt to provide robustness to model misspecification by allowing an unspecified dose–response relationship between the control dose and the first nonzero experimental dose. We then consider experimental designs including the uniform design (the same sample size per dose group) and the c-optimal design (minimizing the standard error of an estimator for a parameter of interest). Our simulation studies showed that (1) the c-optimal design under the traditional three-parameter logistic regression does not help reduce a Type I error rate inflated by model misspecification, (2) it does help under the new parameterization with three parameters (the Type I error rate is close to the fixed significance level), and (3) the new parameterization with four parameters combined with the c-optimal design does not reduce statistical power much while preserving the Type I error rate at the fixed significance level.
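A compact sketch of fitting one hedged version of a three-parameter logistic dose-response model by maximum likelihood (a background response plus a logistic term); the functional form, doses, and simulated counts are assumptions for illustration, not the parameterizations studied in the article.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Illustrative design: control dose plus four positive doses, equal group sizes.
dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
n = np.full(5, 30)
events = np.array([3, 2, 5, 12, 21])   # synthetic counts (note the dip at low dose)

def prob(theta, d):
    """Assumed three-parameter form: gamma + (1 - gamma) * logistic(b0 + b1*d)."""
    gamma = expit(theta[0])             # background risk, kept in (0, 1)
    return gamma + (1.0 - gamma) * expit(theta[1] + theta[2] * d)

def negloglik(theta):
    """Binomial negative log-likelihood over the dose groups."""
    p = np.clip(prob(theta, dose), 1e-9, 1 - 1e-9)
    return -np.sum(events * np.log(p) + (n - events) * np.log(1.0 - p))

fit = minimize(negloglik, x0=np.array([-2.0, -3.0, 1.0]), method="Nelder-Mead")
print("converged:", fit.success)
print("fitted response curve:", np.round(prob(fit.x, dose), 3))
```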

7.
This article models flood occurrence probabilistically and assesses the associated risk. It incorporates atmospheric parameters to forecast rainfall in an area. This measure of precipitation, together with river and ground parameters, serves as input to the model for predicting runoff and, subsequently, the inundation depth of an area. The inundation depth acts as a guide for predicting flood proneness and the associated hazard. The vulnerability owing to flood has been analyzed as social vulnerability (V_S), vulnerability to property (V_P), and vulnerability of the location in terms of awareness (V_A). The associated risk has been estimated for each area. The distribution of risk values can be used to classify every area into one of six risk zones—namely, very low risk, low risk, moderately low risk, medium risk, high risk, and very high risk. Prioritization regarding preparedness, evacuation planning, or distribution of relief items should be guided by the range on the risk scale within which the area under study falls. The flood risk assessment model framework has been tested on a real-life case study, and the flood risk indices for each of the municipalities in the study area have been calculated. The risk indices, and hence the flood risk zone in which a municipality is expected to lie, change every day. The appropriate authorities can then plan ahead in terms of preparedness to combat an impending flood situation in the most critical and vulnerable areas.
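A toy sketch of the final aggregation step only: combining a hazard score derived from inundation depth with the three vulnerability components (V_S, V_P, V_A) into a risk index and mapping it onto six zones; the weights, saturation depth, cut points, and municipality data are all assumptions.

```python
import numpy as np

ZONES = ["very low", "low", "moderately low", "medium", "high", "very high"]

def hazard_score(inundation_depth_m):
    """Map predicted inundation depth to a 0-1 hazard score (assumed saturation at 3 m)."""
    return float(np.clip(inundation_depth_m / 3.0, 0.0, 1.0))

def risk_index(depth_m, v_social, v_property, v_awareness,
               weights=(0.4, 0.3, 0.3)):
    """Risk = hazard x weighted vulnerability; components are 0-1 scores (assumed)."""
    vulnerability = np.dot(weights, [v_social, v_property, v_awareness])
    return hazard_score(depth_m) * vulnerability

def risk_zone(index):
    """Classify a 0-1 risk index into one of six equal-width zones (assumed cut points)."""
    return ZONES[min(int(index * 6), 5)]

# Hypothetical municipalities: (name, depth in m, V_S, V_P, V_A).
for name, depth, vs, vp, va in [("A", 0.3, 0.4, 0.5, 0.2),
                                ("B", 1.6, 0.7, 0.6, 0.8),
                                ("C", 2.9, 0.9, 0.8, 0.9)]:
    r = risk_index(depth, vs, vp, va)
    print(f"municipality {name}: risk index {r:.2f} -> {risk_zone(r)} risk")
```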

8.
9.
This article investigates how different decisions can be reached when decision makers consult a binary rating system and a scale rating system. Since typically decision makers use rating information to make binary decisions, it is particularly important to compare the scale system to the binary system. We show that the only N‐point scale system that reports a rater's opinion consistently with the binary system is one where N is odd and is not divisible by 4. At the aggregate level, however, we illustrate that inconsistencies persist regardless of the choice of N. In addition, we provide simple tools that can determine whether the systems lead decision makers to the same decision outcomes.

10.
We study a two-product inventory model that allows substitution. Both products can be used to supply demand over a selling season of N periods, with a one-time replenishment opportunity at the beginning of the season. A substitution may be offered even when the demanded product is available. The substitution rule is flexible in the sense that the seller can choose whether or not to offer substitution and at what price or discount level, and the customer may or may not accept the offer, with the acceptance probability being a decreasing function of the substitution price. The decisions are the replenishment quantities at the beginning of the season and the dynamic substitution-pricing policy in each period of the season. Using a stochastic dynamic programming approach, we present a complete solution to the problem. Furthermore, we show that the objective function is concave and submodular in the inventory levels—structural properties that facilitate the solution procedure and help identify threshold policies for the optimal substitution/pricing decisions. Finally, with a state transformation, we also show that the objective function satisfies a generalized concavity property, which allows us to derive similar structural properties of the optimal policy for multiple-season problems.
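A small simulation sketch, not the paper's dynamic program: it evaluates, for fixed initial order quantities, a simple policy in which a discounted substitute is offered only when the demanded product is out of stock, with acceptance probability decreasing in the substitution price; the demand distributions, prices, and the policy itself are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

PRICE = np.array([10.0, 8.0])      # regular prices of products 0 and 1 (assumed)
COST = np.array([4.0, 3.0])        # unit procurement costs (assumed)
SUB_PRICE = 6.0                    # price offered for a substitute (assumed)
ACCEPT = lambda p: max(0.0, 1.0 - p / 10.0)   # acceptance prob., decreasing in price
N_PERIODS = 10

def season_profit(q, n_sims=5000):
    """Expected season profit for initial order quantities q = (q0, q1)
    under the stockout-triggered substitution policy described above."""
    total = 0.0
    for _ in range(n_sims):
        inv = np.array(q, dtype=float)
        profit = -np.dot(COST, q)
        for _ in range(N_PERIODS):
            d = rng.poisson([1.0, 1.5])            # per-period demands (assumed)
            for prod in (0, 1):
                other = 1 - prod
                for _ in range(d[prod]):
                    if inv[prod] > 0:              # serve from requested product
                        inv[prod] -= 1
                        profit += PRICE[prod]
                    elif inv[other] > 0 and rng.random() < ACCEPT(SUB_PRICE):
                        inv[other] -= 1            # accepted substitution offer
                        profit += SUB_PRICE
        total += profit
    return total / n_sims

for q in [(10, 15), (12, 18), (15, 20)]:
    print(f"order quantities {q}: expected profit ~ {season_profit(q):.1f}")
```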

11.
Firms often determine whether or not to make components common across products by focusing on the manufacturing and sales of new products only. However, component commonality decisions that ignore remanufacturing can adversely affect the profitability of the firm. In this article we analyze how remanufacturing could reverse the OEM's commonality decision that is based on the manufacturing and sales of new products only. Specifically, we determine the conditions under which the OEM's optimal decision on commonality may be reversed and illustrate how her profit can be significantly higher if remanufacturing is taken into account ex ante. We illustrate the implementation of our model for two products in the Apple iPad family.

12.
In recent years, there have been growing concerns regarding risks in the federal information technology (IT) supply chains that protect cyber infrastructure in the United States. A critical need faced by decisionmakers is to prioritize investment in security mitigations to maximally reduce risks in IT supply chains. We extend existing stochastic expected budgeted maximum multiple coverage models, which identify solutions that are "good" on average but may be unacceptable in certain circumstances. We propose three alternative models that employ different robustness methods to hedge against worst-case risks: models that maximize the worst-case coverage, minimize the worst-case regret, and maximize the average coverage in the (1 − α) worst cases (conditional value at risk). We illustrate the solutions to the robust methods with a case study and discuss the insights their solutions provide into mitigation selection compared with an expected-value maximizer. Our study provides valuable tools and insights for decisionmakers with different risk attitudes to manage cybersecurity risks under uncertainty.
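A brute-force sketch of the three robustness criteria on a tiny mitigation-portfolio example: scenario-dependent coverage values, a budget, and exhaustive search over feasible portfolios, reporting the best portfolio under expected coverage, worst-case coverage, and the lower-tail conditional value at risk; the data and the additive coverage model are assumptions.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(5)

n_mitigations, n_scenarios, budget, alpha = 6, 8, 3, 0.25
cost = np.ones(n_mitigations)                      # unit costs (assumed)
# coverage[i, s]: risk coverage of mitigation i in scenario s (assumed additive).
coverage = rng.uniform(0.0, 1.0, size=(n_mitigations, n_scenarios))

def portfolio_coverage(items):
    """Per-scenario coverage of a portfolio under the additive assumption."""
    return coverage[list(items)].sum(axis=0)

def cvar(values, a):
    """Average of the worst a-fraction of scenario coverages (lower tail)."""
    k = max(1, int(np.ceil(a * len(values))))
    return np.sort(values)[:k].mean()

best = {"expected": None, "worst-case": None, f"CVaR({alpha})": None}
score = {k: -np.inf for k in best}
for r in range(1, n_mitigations + 1):
    for items in combinations(range(n_mitigations), r):
        if cost[list(items)].sum() > budget:
            continue
        cov = portfolio_coverage(items)
        for name, val in [("expected", cov.mean()),
                          ("worst-case", cov.min()),
                          (f"CVaR({alpha})", cvar(cov, alpha))]:
            if val > score[name]:
                score[name], best[name] = val, items

for name in best:
    print(f"{name:12s} optimal portfolio: {best[name]}  (criterion value {score[name]:.2f})")
```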

13.
Many firms make significant investments in developing and managing knowledge within their supply chains. Such investments are often prudent because studies indicate that supply chain knowledge (SCK) has a positive influence on performance. Key questions still surround the SCK–performance relationship, however. First, what is the overall relationship between SCK and performance? Second, under what conditions is the relationship stronger or weaker? To address these questions, we applied meta-analysis to 35 studies of the SCK–performance relationship that collectively include more than 8,400 firms. Our conservative estimate of the effect size of the overall relationship is .39. We also find that the SCK–performance relationship is stronger when (i) examining operational performance, (ii) gathering data from more than one supply chain node, (iii) gathering data from multiple countries, (iv) examining service industries, and (v) among more recently published studies. We also found that studies that embraced a single theory base (as opposed to using multiple ones) reported a stronger SCK–performance relationship. Looking to the future, our meta-analysis highlights the need for studies to (i) include lags between the measurement of SCK and performance, (ii) gather upstream data when examining innovation, (iii) examine SCK within emerging countries, and (iv) provide much more information on the nuances of the SCK examined.
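A generic random-effects meta-analysis sketch for correlation-type effect sizes using the Fisher z transformation and DerSimonian–Laird pooling; the per-study correlations and sample sizes are illustrative, and this is not necessarily the authors' exact aggregation procedure.

```python
import numpy as np

# Illustrative study-level data: correlation r and sample size n per study.
r = np.array([0.25, 0.41, 0.38, 0.52, 0.30, 0.45])
n = np.array([120, 340, 210, 95, 500, 150])

# Fisher z transformation; var(z) ~ 1/(n - 3).
z = np.arctanh(r)
v = 1.0 / (n - 3)

# Fixed-effect weights and heterogeneity statistic Q.
w = 1.0 / v
z_fixed = np.sum(w * z) / np.sum(w)
Q = np.sum(w * (z - z_fixed) ** 2)

# DerSimonian-Laird between-study variance tau^2.
df = len(r) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / c)

# Random-effects pooled estimate, back-transformed to the r metric.
w_re = 1.0 / (v + tau2)
z_re = np.sum(w_re * z) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
pooled_r = np.tanh(z_re)
ci = np.tanh([z_re - 1.96 * se_re, z_re + 1.96 * se_re])

print(f"pooled effect size r = {pooled_r:.2f}, "
      f"95% CI [{ci[0]:.2f}, {ci[1]:.2f}], tau^2 = {tau2:.3f}")
```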

14.
We study an inventory management mechanism that uses two stochastic programs (SPs), the customary one‐period assemble‐to‐order (ATO) model and its relaxation, to conceive control policies for dynamic ATO systems. We introduce a class of ATO systems, those that possess what we call a “chained BOM.” We prove that having a chained BOM is a sufficient condition for both SPs to be convex in the first‐stage decision variables. We show by examples the necessity of the condition. For ATO systems with a chained BOM, our result implies that the optimal integer solutions of the SPs can be found efficiently, and thus expedites the calculation of control parameters. The M system is a representative chained BOM system with two components and three products. We show that in this special case, the SPs can be solved as a one‐stage optimization problem. The allocation policy can also be reduced to simple, intuitive instructions, of which there are four distinct sets, one for each of four different parameter regions. We highlight the need for component reservation in one of these four regions. Our numerical studies demonstrate that achieving asymptotic optimality represents a significant advantage of the SP‐based approach over alternative approaches. Our numerical comparisons also show that outside of the asymptotic regime, the SP‐based approach has a commanding lead over the alternative policies. Our findings indicate that the SP‐based approach is a promising inventory management strategy that warrants further development for more general systems and practical implementations.
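A sample-average-approximation sketch for the second stage of a generic one-period ATO problem on an assumed M-type structure (two components; products 0 and 1 each use one distinct component, product 2 uses one unit of each): for given component order quantities, each demand scenario is allocated by a small linear program. The bill of materials, margins, costs, and demand distributions are assumptions, and the LP allows fractional allocation, unlike the integer solutions discussed in the abstract.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(6)

# Assumed M-type bill of materials: rows = components, cols = products.
BOM = np.array([[1, 0, 1],
                [0, 1, 1]])
margin = np.array([4.0, 5.0, 12.0])    # per-unit product margins (assumed)
comp_cost = np.array([1.0, 1.5])       # per-unit component costs (assumed)
demand_mean = np.array([20.0, 25.0, 10.0])

def second_stage_profit(q, demand):
    """Best (fractional) allocation of on-hand components q to realized demand."""
    res = linprog(c=-margin,                       # maximize total margin
                  A_ub=np.vstack([BOM, np.eye(3)]),
                  b_ub=np.concatenate([q, demand]),
                  bounds=[(0, None)] * 3,
                  method="highs")
    return -res.fun

def saa_objective(q, n_scenarios=300):
    """Sample-average expected profit for component order quantities q."""
    profits = [second_stage_profit(q, rng.poisson(demand_mean))
               for _ in range(n_scenarios)]
    return np.mean(profits) - np.dot(comp_cost, q)

for q in [(30, 35), (35, 40), (40, 45)]:
    print(f"component orders {q}: SAA expected profit ~ {saa_objective(np.array(q)):.1f}")
```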

15.
Matthew Revie, Risk Analysis, 2011, 31(7): 1120–1132
Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation in which zero events have been realized, but these are often ad hoc, relying on methods selected in light of the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The minimax procedure provides a risk-averse inferential procedure when no events are realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference when no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be approximated by a simple expression in the number of trials n.
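A small numerical sketch of the quantity the abstract builds on, not the full minimax procedure: for a candidate event probability p, the probability that a future sample of the same size n realizes no more events than the k observed is the binomial CDF at k; for zero-event data this reduces to (1 − p)^n. The values of n and p below are illustrative.

```python
from scipy.stats import binom

def prob_no_more_events(p, n, k):
    """P(a future sample of size n has at most k events) when the true
    per-trial event probability is p -- the binomial CDF at k."""
    return binom.cdf(k, n, p)

# Zero-event data: the quantity reduces to (1 - p)^n and decreases in p,
# so larger candidate p values are harder to reconcile with seeing no events.
for n in (10, 50, 200):
    for p in (0.001, 0.01, 0.05):
        print(f"n={n:3d}, p={p:.3f}: "
              f"P(future sample also has 0 events) = {prob_no_more_events(p, n, 0):.3f}")
```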

16.
We superimpose a radiation fallout model onto a traffic flow model to assess the evacuation versus shelter-in-place decisions after the daytime ground-level detonation of a 10-kt improvised nuclear device in Washington, DC. In our model, ≈80k people are killed by the prompt effects of blast, burn, and radiation. Of the ≈360k survivors without access to a vehicle, 42.6k would die if they immediately self-evacuated on foot. Sheltering above ground would save several thousand of these lives, and sheltering in a basement (or near the middle of a large building) would save still more of them. Among survivors of the prompt effects with access to a vehicle, the number of deaths depends on the fraction of people who shelter in a basement rather than self-evacuate in their vehicle: 23.1k people die if 90% shelter in a basement and 54.6k die if 10% shelter. Sheltering above ground saves approximately half as many lives as sheltering in a basement. The details related to delayed (i.e., organized) evacuation, search and rescue, decontamination, and situational awareness (via, e.g., telecommunications) have very little impact on the number of casualties. Although antibiotics and transfusion support have the potential to save ≈10k lives (and the number of lives saved by medical care increases with the fraction of people who shelter in basements), the logistical challenge appears to be well beyond current response capabilities. Taken together, our results suggest that the government should initiate an aggressive outreach program to educate citizens and the private sector about the importance of sheltering in place in a basement for at least 12 hours after a terrorist nuclear detonation.

17.
In this paper we consider the multidimensional binary vector assignment problem. An input of this problem is defined by m disjoint multisets \(V^1, V^2, \ldots , V^m\), each composed of n binary vectors of size p. An output is a set of n disjoint m-tuples of vectors, where each m-tuple is obtained by picking one vector from each multiset \(V^i\). To each m-tuple we associate a p-dimensional vector by applying the bit-wise AND operation to the m vectors of the tuple. The objective is to minimize the total number of zeros in these n vectors; we also consider the restriction of the problem in which every vector has at most c zeros. The problem was previously only known to be hard, even in a restricted special case. We show that, assuming the unique games conjecture, it is hard to approximate the problem within a certain constant factor for any fixed values of the parameters; this result is tight, since any solution attains the corresponding approximation ratio. We also prove a hardness result for a restricted case without assuming UGC. Finally, we show that the problem is polynomial-time solvable when one of the parameters is fixed (which cannot be extended to fixing the other).
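An exhaustive solver for tiny instances, following the problem definition in the abstract directly: it tries every way of matching the vectors of the m multisets into n disjoint m-tuples and counts the zeros remaining after the bit-wise AND; the instance data are illustrative.

```python
from itertools import permutations, product
import numpy as np

def solve_bva(multisets):
    """Brute force: `multisets` is a list of m arrays, each n x p with 0/1 entries.
    Keep multiset 0 fixed and permute the others; each column index j of the
    matching forms one m-tuple. Returns the minimum total number of zeros."""
    m = len(multisets)
    n, p = multisets[0].shape
    best = None
    # One permutation per multiset 1..m-1 (multiset 0 stays in place).
    for perms in product(permutations(range(n)), repeat=m - 1):
        total = 0
        for j in range(n):
            combined = multisets[0][j].copy()
            for i, perm in enumerate(perms, start=1):
                combined &= multisets[i][perm[j]]      # bit-wise AND
            total += int(np.sum(combined == 0))
        best = total if best is None else min(best, total)
    return best

# Tiny illustrative instance: m = 3 multisets, n = 3 vectors each, p = 4 bits.
V1 = np.array([[1, 1, 0, 1], [1, 0, 1, 1], [0, 1, 1, 1]])
V2 = np.array([[1, 1, 1, 0], [1, 1, 0, 1], [1, 0, 1, 1]])
V3 = np.array([[1, 1, 1, 1], [0, 1, 1, 1], [1, 1, 0, 1]])
print("minimum total number of zeros:", solve_bva([V1, V2, V3]))
```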

18.
The increasing development of autonomous vehicles (AVs) influences the future of transportation. Beyond the potential benefits in terms of safety, efficiency, and comfort, the potential risks of novel driving technologies also need to be addressed. In this article, we explore risk perceptions toward connected and autonomous driving in comparison to conventional driving. To gain a deeper understanding of individual risk perceptions, we adopted a two-step empirical procedure. First, focus groups (N = 17) were carried out to identify relevant risk factors for autonomous and connected driving. A questionnaire was then developed, which was answered by 516 German participants. In the questionnaire, three driving technologies (connected, autonomous, conventional) were evaluated via a semantic differential (a rating scale used to identify the connotative meaning of the technologies). Second, participants rated perceived risk levels (for data, traffic environment, vehicle, and passenger) and perceived benefits and barriers of connected/autonomous driving. Since previous experience with the automated functions of driver assistance systems can have an impact on the evaluation, three experience groups were formed, and the effect of experience on benefit and barrier perceptions was analyzed. Risk perceptions were significantly lower for conventional driving than for connected/autonomous driving. With increasing experience, risk perception decreases for the novel driving technologies, with one exception: the perceived risk in handling data is not influenced by experience. The findings contribute to an understanding of risk perception in autonomous driving, which helps to foster a successful implementation of AVs on the market and to develop public information strategies.

19.
In an inflationary economy of declining R&D expenditures, effective transfer of innovative ideas from the literature to the laboratory can significantly complement an R&D budget. This paper discusses an effective library management organization and practical mechanisms for improving technology transfer from the corporate library. Experience at Sanders Associates Inc. with the techniques presented has been most favourable, and substantial savings in R&D dollars have been realized.

20.
Detecting abnormal events is one of the fundamental issues in wireless sensor networks (WSNs). In this paper, we investigate \((\alpha ,\tau )\)-monitoring in WSNs. For a given monitored threshold \(\alpha \), we prove that (i) the tight upper bound of \(\Pr [{S(t)} \ge \alpha ]\) is \(O\left( {\exp \left\{ { - n\,\ell \left( {\frac{\alpha }{{n\,sup}},\frac{{\mu (t)}}{{n\,sup}}} \right) } \right\} } \right) \) if \(\mu (t) < \alpha \); and (ii) the tight upper bound of \(\Pr [{S(t)} \le \alpha ]\) is \(O\left( {\exp \left\{ { - n\,\ell \left( {\frac{\alpha }{{n\,sup}},\frac{{\mu (t)}}{{n\,sup}}} \right) } \right\} } \right) \) if \(\mu (t) > \alpha \), where \(\Pr [X]\) is the probability of random event \(X\), \(S(t)\) is the sum of the sensed values in the monitored area at time \(t\), \(n\) is the number of sensor nodes, \(sup\) is the upper bound of the sensed data, \(\mu (t)\) is the expectation of \(S(t)\), and \(\ell ({x_1},{x_2}) = {x_1}\ln \left( {\frac{{{x_1}}}{{{x_2}}}} \right) + (1 - {x_1})\ln \left( {\frac{{1 - {x_1}}}{{1 - {x_2}}}} \right) \). An instant \((\alpha ,\tau )\)-monitoring scheme is then developed based on this upper bound. Moreover, approximate continuous \((\alpha , \tau )\)-monitoring is investigated. We prove that the probability of a false negative alarm is \(\delta \) provided the sample size exceeds a bound determined by the given precision requirement and the corresponding fractile of a standard normal distribution. Finally, the performance of the proposed algorithms is validated through experiments.
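A quick numeric check of the bound quoted in the abstract, exp{−n ℓ(α/(n·sup), μ(t)/(n·sup))} with ℓ(x₁, x₂) = x₁ ln(x₁/x₂) + (1 − x₁) ln((1 − x₁)/(1 − x₂)), compared against a simulated tail probability for bounded i.i.d. sensor readings; the uniform reading distribution and the chosen numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

def ell(x1, x2):
    """Binary relative entropy l(x1, x2) from the abstract."""
    return x1 * np.log(x1 / x2) + (1 - x1) * np.log((1 - x1) / (1 - x2))

n, sup = 100, 1.0          # number of sensors, upper bound of the sensed data
mu = n * 0.5               # E[S(t)] for readings assumed uniform on (0, sup)
alpha = 60.0               # monitored threshold, above the mean here

# Upper bound on P(S(t) >= alpha) when mu < alpha.
bound = np.exp(-n * ell(alpha / (n * sup), mu / (n * sup)))

# Empirical tail probability by simulation of S(t) = sum of n readings.
sims = rng.uniform(0.0, sup, size=(200000, n)).sum(axis=1)
empirical = np.mean(sims >= alpha)

print(f"bound on P(S >= {alpha}):     {bound:.4f}")
print(f"empirical P(S >= {alpha}):    {empirical:.4f}")   # should not exceed the bound
```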
