Similar Articles
A total of 20 similar articles were found (search time: 265 ms).
1.
Paul H. Randolph, Omega, 1976, 4(4): 463-477
A missile range is essentially a large-scale job shop, involving prodigious amounts of test equipment and formidable problems of coordination. Because of its characteristics, a missile range can be treated as a one-machine, N-job scheduling problem. For this problem the major scheduling methods were examined and tested, but all were discarded as infeasible except one, which used a Monte Carlo scheduling procedure combined with statistical stopping rules. An algorithm based on these ideas was constructed, and it has proven to be flexible and workable, providing predictably near-optimum schedules for the missile range within a probabilistic and statistical framework. Even though the system has not yet been fully implemented, considerable benefits to the missile range have already been realized. For example, in order to automate the scheduling process, it was necessary to codify the goals of the range, something that had never been formalized before. This required an unprecedented level of scrutiny and precision in defining these goals, and from this study a reasonable numerical optimization criterion was constructed. Also, in order to use any automated scheduling algorithm, data files had to be stored on tape for ready access, which in turn has improved other operations on the missile range that depend on these data. Furthermore, the scheduling algorithm is providing conflict-free schedules in a few minutes of computer time.
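The abstract describes the approach only at a high level. As a rough sketch of the general idea, assuming a toy one-machine tardiness objective and a simple "no improvement for a while" cutoff standing in for the paper's statistical stopping rules (none of the names or numbers below come from the paper):

```python
import random

def toy_tardiness(sequence, processing_times, due_dates):
    """Toy one-machine objective: total tardiness of a job sequence (illustrative only)."""
    t, tardiness = 0, 0
    for job in sequence:
        t += processing_times[job]
        tardiness += max(0, t - due_dates[job])
    return tardiness

def monte_carlo_schedule(jobs, processing_times, due_dates,
                         max_samples=10_000, patience=500):
    """Sample random sequences and keep the best; stop after `patience` samples
    without improvement (a crude stand-in for a statistical stopping rule)."""
    best_seq, best_cost = None, float("inf")
    since_improvement = 0
    for _ in range(max_samples):
        seq = jobs[:]
        random.shuffle(seq)
        cost = toy_tardiness(seq, processing_times, due_dates)
        if cost < best_cost:
            best_seq, best_cost = seq, cost
            since_improvement = 0
        else:
            since_improvement += 1
        if since_improvement >= patience:
            break
    return best_seq, best_cost

jobs = list(range(8))
p = {j: random.randint(1, 10) for j in jobs}
d = {j: random.randint(5, 40) for j in jobs}
print(monte_carlo_schedule(jobs, p, d))
```

The fixed patience counter is only a placeholder; the paper's stopping rules are statistical, bounding the chance that a meaningfully better schedule remains unsampled.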

2.
In this paper, we present our novel algorithm, SOR (Shrinking Overlapped Range), for minimum-energy multicasting in wireless ad hoc networks. Existing heuristics in the literature have not considered changing the intermediate tree structure, which may result in worse performance even after local improvements at the end. In SOR, we extensively change the intermediate tree structure to maintain a tighter structure in terms of energy consumption. We do so by shrinking overlapped transmission ranges, following the wireless multicast advantage (WMA) property, and by allowing the selection of internal transmissions, which further changes the tree structure. Both theoretical analysis and experimental results show that SOR outperforms other heuristics in the literature. Research is partially supported by NSF and Air Force grants.

3.
Security of infrastructure is a major concern. Traditional security schedules are unable to provide omnipresent coverage; consequently, adversaries can exploit predictable vulnerabilities to their advantage. Randomized security schedules, which randomly deploy security measures, overcome these limitations, but public perceptions of such schedules have not been examined. In this experiment, participants were asked to choose between attending a venue that employed a traditional (i.e., search everyone) or a random (i.e., a probability of being searched) security schedule. The absolute probability of detecting contraband was manipulated (i.e., 1/10, 1/4, 1/2) but kept equivalent between the two schedule types. In general, participants were indifferent to either security schedule, regardless of the probability of detection. The randomized schedule was deemed more convenient, but the traditional schedule was considered fairer and safer. There were no differences between the traditional and random schedules in terms of perceived effectiveness or deterrence. Policy implications for the implementation and utilization of randomized schedules are discussed.

4.
In wireless ad hoc networks, where every device runs on its own battery, energy consumption is critical to extending the network lifetime. Communication among devices in the network can be categorized as unicasting and multicasting (including broadcasting). For unicasting, computing the energy-optimal path between two communicating nodes is polynomially solvable as a shortest-path problem. For multicasting, however, neither a shortest-path tree nor a minimum spanning tree guarantees energy-optimal communication. In this paper, we present our novel approach, the Optimistic Most Energy Gain (OMEGa) method, for minimum-energy multicasting in wireless ad hoc networks. OMEGa aims at maximum utilization of the wireless multicast advantage (WMA), which essentially means covering more nodes with a single higher-energy transmission. Both theoretical and experimental analyses show that the OMEGa method performs very well. Research is partially supported by NSF and Air Force grants.
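The WMA property referenced here means that a single omnidirectional transmission at the power needed to reach the farthest intended receiver also covers every nearer receiver at no extra cost. A minimal sketch of how a multicast tree's energy is computed under that assumption (the node coordinates, tree, and path-loss exponent are invented for illustration and are not the paper's data):

```python
import math

# Hypothetical node positions and a multicast tree given as parent -> children.
positions = {"s": (0, 0), "a": (1, 0), "b": (2, 1), "c": (0, 3), "d": (2, 3)}
tree = {"s": ["a", "c"], "a": ["b", "d"]}
alpha = 2  # assumed path-loss exponent

def transmit_power(node, children):
    """Under WMA, one transmission at the power needed for the farthest child
    covers all children of the node."""
    if not children:
        return 0.0
    return max(math.dist(positions[node], positions[c]) ** alpha for c in children)

total_energy = sum(transmit_power(n, ch) for n, ch in tree.items())
print(f"total multicast tree energy: {total_energy:.2f}")
```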

5.
Risk Analysis of Terrorist Attacks
A quantitative probabilistic/systems analysis model is described that is useful for allocating resources to safeguard valuable documents or materials, in either a fixed-site facility or a moving convoy, against an overt terrorist attack. The model is also useful for ranking the sensitive areas at a site according to their survivability under a given hypothesized terrorist attempt. To compare various defense strategies and security configurations, the probability of a successful terrorist action is computed from event-tree models of the site/security configuration. This calculation incorporates a realistic engagement model (for the event that a guard force engages the terrorists prior to completion of their objective) and information on barrier penetration times (for example, the distribution of the time required to defeat a chain-link fence or vault door, traverse an open area, and so forth). Two security analyses are described to illustrate the methodology. One example considers a terrorist attack on a convoy transporting a missile from a storage facility to a launch facility. The second example involves an attack on a munitions storage facility.
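As a rough illustration of the event-tree idea, a path to a successful attack can be scored by multiplying conditional stage probabilities along the sequence of barriers the adversary must defeat. The stages, probabilities, and independence assumption below are invented for the example, not taken from the model described above:

```python
# Hypothetical event-tree path: each stage must succeed for the attack to succeed.
# Stage probabilities are illustrative assumptions, treated as conditionally independent.
stages = {
    "defeat perimeter fence": 0.7,
    "cross open area before guard engagement": 0.5,
    "defeat vault door": 0.3,
    "exit with material": 0.6,
}

p_success = 1.0
for stage, p in stages.items():
    p_success *= p

print(f"P(successful attack along this path) = {p_success:.3f}")
```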

6.
A dynamic modeling approach to the management of multiechelon, multi-indenture inventory systems with repair is addressed. The structure of the model follows that of the U.S. Air Force Reparable Asset Management System. The model is used as a vehicle to discuss the structure of typical multiechelon systems and to illustrate the advantages of a dynamic modeling approach to such systems.

7.
Firms are increasingly outsourcing information security operations to managed security service providers (MSSPs). Cost reduction and quality (security) improvement are often mentioned as motives for outsourcing information security, and these are also the frequently cited reasons for outsourcing traditional information technology (IT) functions, such as software development and maintenance. In this study, we present a different explanation, one based on interdependent risks and competitive externalities associated with IT security, for firms' decisions to outsource security. We show that in the absence of competitive externalities and interdependent risks, a firm will outsource security if and only if the MSSP offers a quality advantage over in-house operations, which is consistent with the conventional explanation for security outsourcing. However, when security risks are interdependent and breaches impose competitive externalities, although firms still have a stronger incentive to outsource security if the MSSP offers higher quality in terms of preventing breaches than in-house management, a quality advantage of the MSSP over in-house management is neither a prerequisite for a firm to outsource security nor a guarantee that a firm will do so. In addition to MSSP quality, the type of externality (positive or negative), the degree of externality, whether outsourcing increases or decreases risk interdependency, and the breach characteristics determine firms' sourcing decisions. When security breaches impose a positive externality, the incentive to outsource is enhanced if the MSSP decreases the risk interdependency and diminished if the MSSP increases it. A negative externality has the opposite effect on firms' incentives to outsource. A high demand spillover to a competitor, together with a high loss in industry demand because of a security breach, enhances the incentive to outsource security operations when the externality is negative. Finally, we extend our base model in several dimensions and show that our main results regarding the impact of interdependent risks and competitive externalities on sourcing decisions are robust and generalizable to different specifications.

8.
In response to increasing costs and reductions in manpower, the Tactical Air Command (TAC) of the USAF experimented with a specialized productivity measurement model known as data envelopment analysis (DEA). A medium-sized application of DEA was employed by TAC to evaluate the productivity of its seventeen subordinate vehicle maintenance sections over a four-year period. The application reports gains in productivity and the reactions of the field managers to the use of DEA.
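For readers unfamiliar with DEA, a minimal sketch of the standard input-oriented CCR multiplier model, solved as one linear program per decision-making unit (DMU), is shown below. The maintenance-section data are invented, and this is a generic DEA formulation rather than the specific TAC application:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: rows are maintenance sections (DMUs),
# inputs = [labor hours, budget], outputs = [vehicles serviced].
X = np.array([[100.0, 50.0], [120.0, 60.0], [90.0, 55.0], [110.0, 40.0]])  # inputs
Y = np.array([[200.0], [210.0], [170.0], [220.0]])                          # outputs

def ccr_efficiency(o, X, Y):
    """Input-oriented CCR multiplier model for DMU `o`:
    max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0 for all j,  u, v >= 0."""
    n, m = X.shape          # number of DMUs, number of inputs
    s = Y.shape[1]          # number of outputs
    c = np.concatenate([-Y[o], np.zeros(m)])                   # minimize -u.y_o
    A_ub = np.hstack([Y, -X])                                  # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun  # efficiency score in (0, 1]

for o in range(X.shape[0]):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o, X, Y):.3f}")
```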

9.
We evaluate, for the U.S. case, the costs and benefits of three security measures designed to reduce the likelihood of a direct replication of the 9/11 terrorist attacks. To do so, we assess risk reduction, losses, and security costs in the context of the full set of security layers. The three measures evaluated are installed physical secondary barriers (IPSB) to restrict access to the hardened cockpit door during door transitions, the Federal Air Marshal Service (FAMS), and the Federal Flight Deck Officer (FFDO) Program. In the process, we examine an alternative policy: doubling the budget of the FFDO program to $44 million per year, installing IPSBs in all U.S. aircraft at a cost of $13.5 million per year, and reducing funding for FAMS by 75% to $300 million per year. A break-even cost-benefit analysis then finds the minimum probability of an otherwise successful attack required for the benefit of each security measure to equal its cost. We find that the IPSB is cost-effective if the annual probability of an otherwise successful attack exceeds 0.5%, or one attack every 200 years. The FFDO program is cost-effective if the annual attack probability exceeds 2%. On the other hand, more than two otherwise successful attacks per year are required for FAMS to be cost-effective. A policy that includes IPSBs, an increased budget for FFDOs, and a reduced budget for FAMS may be a viable policy alternative, potentially saving hundreds of millions of dollars per year with consequences for security that are, at most, negligible.
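The break-even logic can be sketched in a few lines: a measure is cost-effective once (annual attack probability) x (risk reduction attributable to the measure) x (losses from a successful attack) exceeds its annual cost, so the break-even probability is the cost divided by the product of the other two terms. The risk-reduction fractions and loss figure below are placeholders, so the outputs will not reproduce the article's 0.5% and 2% results:

```python
# Break-even sketch (illustrative only): break-even annual attack probability
# = annual cost / (risk reduction * losses per otherwise successful attack).
measures = {
    # name: (annual cost in $M, assumed fraction of risk the measure removes)
    "IPSB": (13.5, 0.05),
    "FFDO (doubled budget)": (44.0, 0.04),
    "FAMS (full budget)": (1200.0, 0.10),
}
losses_per_successful_attack = 5000.0  # $M, placeholder

for name, (cost, risk_reduction) in measures.items():
    p_breakeven = cost / (risk_reduction * losses_per_successful_attack)
    print(f"{name}: break-even annual attack probability = {p_breakeven:.2%}")
```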

10.
Customer service is a key aspect of restaurant success, as performance has shown a reliable positive relationship with customer retention. However, waitstaff performance may deteriorate, as income from gratuities is often unrelated to service quality. The present study investigated the effectiveness of an intervention consisting of task clarification and task-specific feedback on restaurant service tasks, and observed the relationship between task completion and gratuities. Three adult women servers participated during their regular working shifts at a local dine-in restaurant. Initially, customer service task completion was low (36% on average across participants). Performance increased immediately following the introduction of the intervention, and all participants maintained 87.5%–100% task completion. Correlational analyses found that gratuities were unrelated to performance and may thus pose a problem for performance maintenance. Implications relating to feedback and payment schedules are discussed.

11.
In this paper, we present three schemes to solve the minimum total energy broadcasting problem in wireless ad hoc networks, based on an efficient IP (integer programming) subproblem technique. Due to the problem's NP-hardness, many heuristics have been studied. However, the heuristics in the literature suffer from coarse performance ratios. Knowledge of the optimal solution structure is important for developing more efficient heuristics and algorithms. We present one IP formulation and two iterative algorithms that make use of relaxed IPs to solve subproblems. The computational results show that our approaches outperform other techniques in the literature. Research is partially supported by NSF and Air Force grants.
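For very small instances, the optimal solution structure the authors refer to can also be obtained by brute force, which is sometimes useful for sanity-checking heuristics. The sketch below enumerates candidate transmit-power assignments for a four-node example (positions and path-loss exponent are made up); it is not the paper's IP formulation:

```python
import itertools
import math

# Tiny brute-force illustration of minimum-energy broadcast: each node either stays
# silent or transmits at exactly the power needed to reach some other node; we keep
# the cheapest assignment whose transmissions reach every node from the source.
positions = {"s": (0, 0), "a": (1, 0), "b": (2, 0), "c": (0, 2)}
alpha = 2
nodes = list(positions)

def power(u, v):
    return math.dist(positions[u], positions[v]) ** alpha

def reachable(assignment):
    """Nodes reached from the source when each node transmits at its assigned power."""
    reached, frontier = {"s"}, ["s"]
    while frontier:
        u = frontier.pop()
        for v in nodes:
            if v not in reached and power(u, v) <= assignment[u]:
                reached.add(v)
                frontier.append(v)
    return reached

# Candidate power levels per node: 0 (silent) or the power to reach some other node.
levels = {u: [0.0] + [power(u, v) for v in nodes if v != u] for u in nodes}
best = None
for combo in itertools.product(*(levels[u] for u in nodes)):
    assignment = dict(zip(nodes, combo))
    if reachable(assignment) == set(nodes):
        cost = sum(combo)
        if best is None or cost < best[0]:
            best = (cost, assignment)

print("optimal broadcast energy:", best[0])
print("transmit powers:", best[1])
```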

12.
This article describes a methodology for risk-informed benefit–cost analyses of homeland security research products. The methodology is field-tested with 10 research products developed for the U.S. Coast Guard. Risk-informed benefit–cost analysis is a tool for risk management that integrates elements of risk analysis, decision analysis, and benefit–cost analysis. The cost analysis methodology includes a full-cost accounting of research projects, starting with initial fundamental research costs and extending to the costs of implementation of the research products and, where applicable, training, maintenance, and upgrade costs. The benefits analysis methodology is driven by changes in costs and risks, leading to five alternative models: cost savings at the same level of security, increased security at the same cost, signal detection improvements, risk reduction by deterrence, and value of information. The U.S. Coast Guard staff selected 10 research projects to test and generalize the methodology. Examples include tools to improve the detection of explosives, reduce the costs of harbor patrols, and provide better predictions of hurricane wind speeds and floods. Benefits models and estimates varied by research project, and many input parameters of the benefit estimates were highly uncertain, so risk analysis for sensitivity testing and simulation was important. Aggregating across the 10 research products, we found an overall median net present value of about $385 million, with a range from $54 million (5th percentile) to $877 million (95th percentile). Lessons learned are provided for future applications.
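A highly simplified sketch of this style of simulation-based aggregation, drawing uncertain per-project benefits, discounting, summing across projects, and reporting the median and 5th/95th percentiles, is given below. The project values, distributions, horizon, and discount rate are placeholders, not the Coast Guard inputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims, years, discount_rate = 10_000, 10, 0.07
discount_sum = sum(1.0 / (1.0 + discount_rate) ** t for t in range(1, years + 1))

# Placeholder projects: (low, high) annual benefit in $M and a fixed annual cost in $M.
projects = [((2.0, 10.0), 1.0), ((1.0, 6.0), 0.5), ((5.0, 20.0), 3.0), ((0.5, 4.0), 0.2)]

npv = np.zeros(n_sims)
for (lo, hi), annual_cost in projects:
    annual_benefit = rng.uniform(lo, hi, size=n_sims)   # uncertain benefit per draw
    npv += (annual_benefit - annual_cost) * discount_sum

print(f"median NPV : ${np.median(npv):.0f}M")
print(f"5th pctile : ${np.percentile(npv, 5):.0f}M")
print(f"95th pctile: ${np.percentile(npv, 95):.0f}M")
```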

13.
We take cohorts of entering freshmen at the United States Air Force Academy and assign half to peer groups designed to maximize the academic performance of the lowest ability students. Our assignment algorithm uses nonlinear peer effects estimates from the historical pre-treatment data, in which students were randomly assigned to peer groups. We find a negative and significant treatment effect for the students we intended to help. We provide evidence that within our “optimally” designed peer groups, students avoided the peers with whom we intended them to interact and instead formed more homogeneous subgroups. These results illustrate how policies that manipulate peer groups for a desired social outcome can be confounded by changes in the endogenous patterns of social interactions within the group.

14.
Intermittent demand is characterized by occasional demand arrivals interspersed with time intervals during which no demand occurs. These demand patterns pose considerable difficulties in terms of forecasting and stock control due to their compound nature, which implies variability both in demand arrivals and in demand sizes. An intuitively appealing strategy for dealing with such patterns from a forecasting and stock control perspective is to aggregate demand into lower-frequency 'time buckets', thereby reducing the presence of zero observations. In this paper, we investigate the impact of forecasting aggregation on the stock control performance of intermittent demand patterns. The benefit of the forecasting aggregation approach is empirically assessed by means of an analysis of a large demand dataset from the Royal Air Force (UK). The results show that the aggregation forecasting approach results in higher achieved service levels than the classical forecasting approach. Moreover, when combined service-cost performance is considered, the results also show that the former approach is more efficient than the latter, especially for high target service levels.
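A minimal sketch of the temporal aggregation step, summing a mostly-zero monthly series into quarterly buckets so that fewer zero periods remain before forecasting, is shown below; the series and bucket size are illustrative, and the paper's forecasting and stock-control steps are not reproduced:

```python
# Aggregate a monthly intermittent demand series into quarterly "time buckets".
monthly_demand = [0, 0, 4, 0, 0, 0, 7, 0, 0, 2, 0, 0,
                  0, 5, 0, 0, 0, 0, 3, 0, 0, 0, 6, 0]
bucket = 3  # three months per quarter

quarterly_demand = [sum(monthly_demand[i:i + bucket])
                    for i in range(0, len(monthly_demand), bucket)]

zeros_before = sum(d == 0 for d in monthly_demand) / len(monthly_demand)
zeros_after = sum(d == 0 for d in quarterly_demand) / len(quarterly_demand)

print("quarterly buckets:", quarterly_demand)
print(f"share of zero periods: {zeros_before:.0%} monthly -> {zeros_after:.0%} quarterly")
```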

15.
We consider scheduling issues at Beyçelik, a Turkish automotive stamping company that uses presses to give shape to metal sheets in order to produce auto parts. The problem concerns the minimization of the total completion time of job orders (i.e., makespan) during a planning horizon. This problem may be classified as a combined generalized flowshop and flexible flowshop problem with special characteristics. We show that the Stamping Scheduling Problem is NP-hard. We develop an integer programming-based method to build realistic and usable schedules. Our results show that the proposed method is able to find higher quality schedules (i.e., shorter makespan values) than both the company's current process and a model from the literature. However, the proposed method has a relatively long run time, which is not practical for the company in situations when a (new) schedule is needed quickly (e.g., when there is a machine breakdown or a rush order). To improve the solution time, we develop a second method that is inspired by decomposition. We show that the second method provides higher-quality solutions (in most cases optimal solutions) in a shorter time. We compare the performance of all three methods with the company's schedules. The second method finds a solution in minutes, compared to Beyçelik's current process, which takes 28 hours. Further, the makespan values of the second method are about 6.1% shorter than the company's schedules. We estimate that the company can save over €187,000 annually by using the second method. We believe that the models and methods developed in this study can be used in similar companies and industries.

16.
We present polynomial-time algorithms for single-machine problems with generalized positional deterioration effects and machine maintenance. Decisions must be made regarding the sequence of jobs and the number of maintenance activities to include in the schedule in order to minimize the overall makespan. We deal with general non-decreasing functions to represent the deterioration rates of job processing times. Another novel extension of existing models is our assumption that a maintenance activity does not necessarily restore the machine fully to its original perfect state. In the resulting schedules, the jobs are split into groups, with a particular group sequenced after a particular maintenance period, and the actual processing time of a job is affected by the group that the job is placed into and its position within that group.
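To make the positional-deterioration idea concrete, the sketch below computes a schedule's makespan when the job in position r of a group has its base time inflated by a non-decreasing factor g(r) and each maintenance activity removes only part of the accumulated deterioration. The functional form of g, the restoration factor, and the job data are illustrative assumptions, not the paper's model:

```python
# Jobs are split into groups separated by maintenance activities. Within a group,
# the job in position r has its base time multiplied by a non-decreasing positional
# factor g(r). Maintenance only partially restores the machine, modeled here by a
# residual multiplier carried into the next group. All numbers are illustrative.
def g(r):
    return 1.0 + 0.1 * (r - 1)      # assumed non-decreasing deterioration rate

maintenance_time = 4.0
restoration = 0.5                    # fraction of accumulated deterioration removed

def makespan(groups):
    total, carry = 0.0, 1.0          # carry = residual deterioration multiplier
    for k, group in enumerate(groups):
        for r, base_time in enumerate(group, start=1):
            total += base_time * carry * g(r)
        if k < len(groups) - 1:      # maintenance between consecutive groups
            total += maintenance_time
            carry = 1.0 + (carry * g(len(group)) - 1.0) * (1.0 - restoration)
    return total

jobs_in_groups = [[3.0, 5.0, 2.0], [4.0, 1.0]]
print(f"makespan with one maintenance activity: {makespan(jobs_in_groups):.2f}")
```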

17.
The study shows that customers' perception of overall quality in service centers is driven by two components: service encounter quality and the quality of the servicescape. Thus, customers' overall evaluation of service quality in a service center is a function of n service encounter qualities (k = 1, …, n) and the quality of the servicescape. Furthermore, it is of great importance for the quality management of a service center that there can be (1) direct effects between the quality dimensions and overall service quality, as well as (2) direct effects among the quality dimensions themselves. The study was carried out at the EuroAirport Basel-Mulhouse-Freiburg as an illustrative case. The survey results suggest that servicescape quality has a significant effect both on overall service quality and on the service encounter qualities. Moreover, the study shows that key quality indicators of the service center relate to the perception of security as well as to the service environment (for instance, the arrangement of check-in/security desks, the comfort of seats, and the ambience of the waiting area).

18.
Two issues are examined concerning the evaluation of job-shop schedules. First, machine and labor cost/utilization measures are suggested as alternatives to physical utilization criteria. Second, the use of Kiviat charts is explored as a means of integrating multidimensional criteria for the overall evaluation of the quality of job-shop schedules.
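A Kiviat (radar) chart plots several normalized criteria for one schedule on the spokes of a polar plot, so that a larger, more balanced polygon indicates a better overall schedule. A minimal matplotlib sketch with invented criteria and scores (not the article's data):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical normalized scores (0-1, higher is better) for one job-shop schedule.
criteria = ["Machine cost", "Labor cost", "Machine utilization",
            "Labor utilization", "Tardiness"]
scores = [0.8, 0.6, 0.7, 0.9, 0.5]

angles = np.linspace(0, 2 * np.pi, len(criteria), endpoint=False).tolist()
closed_scores = scores + scores[:1]      # close the polygon
closed_angles = angles + angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(closed_angles, closed_scores, linewidth=2)
ax.fill(closed_angles, closed_scores, alpha=0.25)
ax.set_xticks(angles)
ax.set_xticklabels(criteria)
ax.set_ylim(0, 1)
ax.set_title("Kiviat chart for one job-shop schedule (illustrative scores)")
plt.show()
```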

19.
Air and cruise missile defense of the U.S. homeland is characterized by a requirement to protect a large number of critical assets nonuniformly dispersed over a vast area with relatively few defensive systems. In this article, we explore strategy alternatives to make the best use of existing defense resources and suggest this approach as a means of reducing risk while mitigating the cost of developing and acquiring new systems. We frame the issue as an attacker-defender problem with simultaneous moves. First, we outline and examine the relatively simple problem of defending comparatively few locations with two surveillance systems. Second, we present our analysis and findings for a more realistic scenario that includes a representative list of U.S. critical assets. Third, we investigate sensitivity to defensive strategic choices in the more realistic scenario. As part of this investigation, we describe two complementary computational methods that, under certain circumstances, allow one to reduce large computational problems to a more manageable size. Finally, we demonstrate that strategic choices can be an important supplement to material solutions and can, in some cases, be a more cost-effective alternative.
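The article's model is not reproduced here, but a simultaneous-move attacker-defender setting is often formalized as a matrix game. The sketch below solves a small zero-sum game for the defender's optimal mixed surveillance strategy with a linear program; the payoff matrix (probability the attack is defeated) and the strategy labels are invented for the example:

```python
import numpy as np
from scipy.optimize import linprog

# Rows: defender surveillance postures; columns: attacker target choices.
# Entries: probability the attack is defeated (defender payoff); values are made up.
A = np.array([
    [0.9, 0.3, 0.2],   # concentrate sensors on region 1
    [0.3, 0.8, 0.3],   # concentrate sensors on region 2
    [0.5, 0.5, 0.6],   # spread sensors evenly
])
m, n = A.shape

# Zero-sum game LP: maximize v subject to sum_i x_i * A[i, j] >= v for every
# attacker column j, sum_i x_i = 1, x >= 0. Variables are [x_1..x_m, v].
c = np.concatenate([np.zeros(m), [-1.0]])            # linprog minimizes, so use -v
A_ub = np.hstack([-A.T, np.ones((n, 1))])            # v - sum_i x_i A[i, j] <= 0
b_ub = np.zeros(n)
A_eq = np.concatenate([np.ones(m), [0.0]]).reshape(1, -1)
bounds = [(0, None)] * m + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=bounds, method="highs")
x, v = res.x[:m], res.x[m]
print("defender mixed strategy:", np.round(x, 3))
print("guaranteed defeat probability (game value):", round(v, 3))
```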

20.
