Similar Documents
20 similar documents found.
1.
Breakpoint graph decomposition is a crucial step in all recent approximation algorithms for SORTING BY REVERSALS, one of the best-known algorithmic problems in computational molecular biology. Caprara and Rizzi recently improved the approximation ratio for breakpoint graph decomposition to 1.4348 + ε, for any positive ε. In this paper, we extend the techniques of Caprara and Rizzi and incorporate a balancing argument to further improve the approximation ratio to 1.4193 + ε, for any positive ε. These improvements imply improved approximation results for SORTING BY REVERSALS for almost all random permutations.
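The breakpoint count is the quantity these decomposition bounds act on. A minimal sketch of computing it for a permutation, using the standard textbook definition (this is background, not code from the paper):

```python
def breakpoints(perm):
    """Count breakpoints of a permutation of 1..n.

    The permutation is framed with 0 at the front and n+1 at the end;
    a breakpoint is any adjacent pair whose values do not differ by
    exactly 1.  Since one reversal removes at most two breakpoints,
    breakpoints(perm) / 2 lower-bounds the reversal distance.
    """
    n = len(perm)
    framed = [0] + list(perm) + [n + 1]
    return sum(1 for a, b in zip(framed, framed[1:]) if abs(a - b) != 1)
```

For example, the identity has no breakpoints, while `[3, 2, 1]` has two (at the frame boundaries), so one reversal suffices.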

2.
Context in the Risk Assessment of Digital Systems
As the use of digital computers for instrumentation and control of safety-critical systems has increased, there has been a growing debate over whether probabilistic risk assessment techniques can be applied to these systems. This debate has centered on the issue of whether software failures can be modeled probabilistically. This paper describes a context-based approach to software risk assessment that explicitly recognizes that the behavior of software is not probabilistic. The perceived uncertainty in its behavior arises both from the input to the software and from the application and environment in which the software operates. Failures occur as the result of encountering some context for which the software was not properly designed, as opposed to the software simply failing randomly. The paper elaborates on the concept of error-forcing context as it applies to software. It also illustrates a methodology which utilizes event trees, fault trees, and the Dynamic Flowgraph Methodology (DFM) to identify error-forcing contexts for software in the form of fault tree prime implicants.

3.
This study uses a sample of U.S. students and Iraqi students to determine whether differences in ethical perceptions occur based on cultural/demographic factors. Irrespective of demographics, the results of this study indicate significant cultural differences between Iraqi students and American students with regard to selected ethical issues concerning graduate education. Specifically, the differences occurred in the students' perceptions of "winning is everything," "selling one's soul," "logic before emotion," and "pandering to professors." Iraqi students consistently viewed these beliefs as more necessary for success in their graduate education than did their American counterparts.

4.
Given a set of points P in a metric space, let l_k(P) denote the ratio of lengths between the shortest k-edge-connected Steiner network and the shortest k-edge-connected spanning network on P, and let r_k = inf_P l_k(P) for k ≥ 1. In this paper, we show that in any metric space, r_k ≥ 3/4 for k ≥ 2, and that there exists a polynomial-time α-approximation for the shortest k-edge-connected Steiner network, where α = 2 for even k and α = 2 + 4/(3k) for odd k. In the Euclidean plane, sharper bounds hold.

5.
Hattis, Dale; Banati, Prerna; Goble, Rob; Burmaster, David E. Risk Analysis, 1999, 19(4): 711-726.
This paper reviews existing data on the variability in parameters relevant for health risk analyses. We cover both exposure-related parameters and parameters related to individual susceptibility to toxicity. The toxicity/susceptibility database under construction is part of a longer-term research effort to lay the groundwork for quantitative distributional analyses of non-cancer toxic risks. These data are broken down into a variety of parameter types that encompass different portions of the pathway from external exposure to the production of biological responses. The discrete steps in this pathway, as we now conceive them, are:
- Contact Rate (breathing rates per body weight; fish consumption per body weight)
- Uptake or Absorption as a Fraction of Intake or Contact Rate
- General Systemic Availability Net of First-Pass Elimination and Dilution via Distribution Volume (e.g., initial blood concentration per mg/kg of uptake)
- Systemic Elimination (half-life or clearance)
- Active Site Concentration per Systemic Blood or Plasma Concentration
- Physiological Parameter Change per Active Site Concentration (expressed as the dose required to make a given percentage change in different people, or the dose required to achieve some proportion of an individual's maximum response to the drug or toxicant)
- Functional Reserve Capacity: the change in a baseline physiological parameter needed to produce a biological response or pass a criterion of abnormal function
Comparison of the amounts of variability observed for the different parameter types suggests that appreciable variability is associated with the final step in the process: differences among people in functional reserve capacity.
This implies that relevant information for estimating effective toxic susceptibility distributions may be gleaned by direct studies of the population distributions of key physiological parameters in people who are not exposed to the environmental and occupational toxicants thought to perturb those parameters. This is illustrated with some recent observations of the population distributions of Low Density Lipoprotein Cholesterol from the second and third National Health and Nutrition Examination Surveys.

6.
Putzrath, Resha M.; Wilson, James D. Risk Analysis, 1999, 19(2): 231-247.
We investigated the way results of human health risk assessments are used, and the theory used to describe those methods, sometimes called the NAS paradigm. Contrary to a key tenet of that theory, current methods have strictly limited utility. The characterizations now considered standard, Safety Indices such as Acceptable Daily Intake, Reference Dose, and so on, usefully inform only decisions that require a choice between two policy alternatives (e.g., approve a food additive or not), decided solely on the basis of a finding of safety. Risk is characterized as the quotient of one of these Safety Indices divided by an estimate of exposure: a quotient greater than one implies that the situation may be considered safe. Such decisions are very widespread, both in the U.S. federal government and elsewhere. No current method is universal; different policies lead to different practices, for example in California's Proposition 65, where statutory provisions specify some practices. Further, an important kind of human health risk assessment is not recognized by this theory: one that characterizes risk as the likelihood of harm, given estimates of exposure consequent to various decision choices. Likelihood estimates are necessary whenever decision makers have many possible decision choices and must weigh more than two societal values, as in EPA's implementation of conventional air pollutant regulation. These estimates cannot be derived using current methods; different methods are needed. Our analysis suggests changes needed in both the theory and practice of human health risk assessment, and in how what is done is depicted.

7.
The author suggests a Weberian methodology, based on theories of democracy and organization, for assessing normative implications of public organizations. How different organizational models contribute to (re)create democracy and legitimacy is scrutinized with reference to a Swedish IT program. The conclusion is that a system management organization will be an appropriate choice for dealing with tame problems, but it will at the same time promote an elitist democratization. In contrast, a development organization will be more appropriate in dealing with complex problems, and it will most likely promote discursive democratization.

8.
9.
We present a few comments on the paper "Attacking the market split problem with lattice point enumeration" by A. Wasserman, published in the Journal of Combinatorial Optimization, vol. 6, pp. 5-16, 2002.

10.
Center and Distinguisher for Strings with Unbounded Alphabet
Consider two sets B and G of strings of length L with characters from an unbounded alphabet Σ, i.e., the size of Σ is not bounded by a constant and has to be taken into consideration as a parameter of the input size. A closest string s* of G is a string that minimizes the maximum Hamming distance(s, s*) over all strings s ∈ G. In contrast, a farthest string t* from B maximizes the minimum Hamming distance(t*, t) over all t ∈ B. A distinguisher of G from B is a string that is close to every string in G and far away from every string in B. We obtain polynomial-time approximation schemes to settle the above problems.
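For intuition about the closest-string objective above, here is an exhaustive-search sketch for tiny instances. It is exponential in the string length, so it only illustrates the definition; the abstract's contribution is polynomial-time approximation schemes, which this is not:

```python
from itertools import product

def hamming(s, t):
    """Number of positions where strings s and t differ."""
    return sum(a != b for a, b in zip(s, t))

def closest_string(strings, alphabet):
    """Brute-force closest string: a string minimizing the maximum
    Hamming distance to every input string.  Feasible only for very
    short strings over small alphabets."""
    L = len(strings[0])
    best = min(product(alphabet, repeat=L),
               key=lambda c: max(hamming(c, s) for s in strings))
    return ''.join(best), max(hamming(best, s) for s in strings)
```

For `["ab", "bb"]` over alphabet `"ab"` the optimal radius is 1, since no single string can agree with both inputs in the first position.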

11.
We consider the problem of approximating the global minimum of a general quadratic program (QP) with n variables subject to m ellipsoidal constraints. For m = 1, we rigorously show that an ε-minimizer, where the error ε ∈ (0, 1), can be obtained in polynomial time, meaning that the number of arithmetic operations is polynomial in n, m, and log(1/ε). For m ≥ 2, we present a polynomial-time (1 − ε)-approximation algorithm as well as a semidefinite programming relaxation for this problem. In addition, we present approximation algorithms for solving QP under box constraints and assignment polytope constraints.

12.
This article employs an institutional perspective in formulating predictions about the ethical futures of privatization partnerships. Although this paper focuses on ethical concerns in the U.S. public sector, it incorporates a multinational dimension in (a) comparing the meaning of privatization among societies and (b) probing privatization financing in the global economy. Five assumptions that flow from institutional reasoning are made explicit as supports for subsequent predictions. The institutional logic shifts privatization conversation away from conventional debate about competition and efficiency toward centralizing forces in both sectors in response to globalization. In that regard, this study identifies the systemic erosion of (local) community integrity as the key privatization problem of the future.  相似文献   

13.
For a multigraph G = (V, E), let s ∈ V be a designated vertex of even degree, and let λ_G(V − s) denote min{c_G(X) | ∅ ≠ X ⊊ V − s}, where c_G(X) denotes the size of cut X. Splitting two adjacent edges (s, u) and (s, v) means deleting these edges and adding a new edge (u, v). For an integer k, splitting two edges e1 and e2 incident to s is called (k, s)-feasible if λ_{G′}(V − s) ≥ k holds in the resulting graph G′. In this paper, we prove that, for a planar graph G and an even k or k = 3 with k ≤ λ_G(V − s), there exists a complete (k, s)-feasible splitting at s such that the resulting graph G′ is still planar, and we present an O(n³ log n) time algorithm for finding such a splitting, where n = |V|. However, for every odd k ≥ 5, there is a planar graph G with a vertex s that has no complete (k, s)-feasible and planarity-preserving splitting. As an application of this result, we show that for an outerplanar graph G and an even integer k, the problem of optimally augmenting G to a k-edge-connected planar graph can be solved in O(n³ log n) time.
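The splitting operation itself is simple to state in code. The sketch below (function names `split_off`, `cut_value`, and `local_connectivity` are ours, not the paper's) performs one split at s and checks the cut condition by brute force over subsets, which is only viable for toy graphs:

```python
from itertools import combinations

def cut_value(edges, X):
    """Edges of the multigraph (edge list) with exactly one endpoint in X."""
    return sum((u in X) != (v in X) for u, v in edges)

def local_connectivity(edges, vertices, s):
    """Brute-force min of cut_value over nonempty proper subsets of V - s."""
    others = [v for v in vertices if v != s]
    return min(cut_value(edges, set(sub))
               for r in range(1, len(others))
               for sub in combinations(others, r))

def split_off(edges, s, u, v):
    """Delete one copy each of (s, u) and (s, v); add a new edge (u, v)."""
    edges = list(edges)
    for x in (u, v):
        if (s, x) in edges:
            edges.remove((s, x))
        else:
            edges.remove((x, s))
    edges.append((u, v))
    return edges
```

On a small multigraph with two parallel edges from s to each of a and b plus an edge (a, b), splitting (s, a) and (s, b) preserves the connectivity among V − s, illustrating a (k, s)-feasible split.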

14.
For a Boolean function f given by a Boolean formula (or a binary circuit) S, we discuss the problem of building a Boolean formula (binary circuit) of minimal size which computes a function g equivalent to f, or ε-equivalent to f, i.e., Prob[f(x) ≠ g(x)] ≤ ε. In this paper we prove that if P ≠ NP, then this problem cannot be approximated with a good approximation ratio by a polynomial-time algorithm.
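The ε-equivalence notion can be made concrete by full truth-table enumeration, which is exponential in the number of inputs and therefore only a definitional sketch (the hardness result above concerns minimizing formula size, not this check):

```python
from itertools import product

def eps_equivalent(f, g, n, eps):
    """Check Prob[f(x) != g(x)] <= eps under the uniform distribution
    on {0,1}^n, by enumerating all 2^n inputs (small n only)."""
    disagreements = sum(f(x) != g(x) for x in product((0, 1), repeat=n))
    return disagreements / 2 ** n <= eps
```

For instance, XOR on two bits disagrees with the constant-0 function on exactly half of the four inputs, so the two are 0.5-equivalent but not 0.25-equivalent.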

15.
Hammitt, James K.; Belsky, Eric S.; Levy, Jonathan I.; Graham, John D. Risk Analysis, 1999, 19(6): 1037-1058.
Residential building codes intended to promote health and safety may produce unintended countervailing risks by adding to the cost of construction. Higher construction costs increase the price of new homes and may increase health and safety risks through income and stock effects. The income effect arises because households that purchase a new home have less income remaining for spending on other goods that contribute to health and safety. The stock effect arises because suppression of new-home construction leads to slower replacement of less safe housing units. These countervailing risks are not presently considered in code debates. We demonstrate the feasibility of estimating the approximate magnitude of countervailing risks by combining the income effect with three relatively well understood and significant home-health risks. We estimate that a code change that increases the nationwide cost of constructing and maintaining homes by $150 (0.1% of the average cost to build a single-family home) would induce offsetting risks yielding between 2 and 60 premature fatalities or, including morbidity effects, between 20 and 800 lost quality-adjusted life years (both discounted at 3%) each year the code provision remains in effect. To provide a net health benefit, the code change would need to reduce risk by at least this amount. Future research should refine these estimates, incorporate quantitative uncertainty analysis, and apply a full risk-tradeoff approach to real-world case studies of proposed code changes.

16.
This paper first reviews the literature on the role of codes of conduct for organisations in Hong Kong in their attempts to manage increasingly complex ethical problems and issues. It shows that, although valuable foundations exist upon which to build, research and understanding of the subject is at its embryonic stage. Social psychology literature is examined to investigate what lessons those concerned with the study of ethics may learn, and the work of Hofstede, seminal in the area of work-related values, is emphasised in this context. Following Hofstede's proposals for strategies for operationalizing constructs about human values, a content analysis was conducted on a pilot sample of codes provided by Hong Kong organisations. The results show three clearly identified clusters of organisations with common formats. The first group, described as Foreign Legal, emphasises legal compliance, has criteria for invoking penalties, and consists of foreign-owned, large multinational organisations. Companies in the second cluster have codes which, except in the case of a couple of larger organisations, mainly follow the Independent Commission Against Corruption's (ICAC) standard format. The third cluster, described as the Bank Network, also appears to largely conform to a format: the Hong Kong Banking Association's guidelines. Further analysis of the Hong Kong codes indicates the important role of emic teleological values, such as trust and reputation, amongst indigenous organisations, rather than the amorality suggested by an earlier study (Dolecheck and Bethke, 1990). These results support the proposition that Hong Kong ethical perspectives are culture-bound, as there appear to be different emphases than revealed in an American study (Stevens, 1994), which identified an emphasis in the US codes upon introverted organisational issues and a failure to espouse deontological values. The conclusion is that designing a research programme on business values in Hong Kong requires reference to studies of values in cross-cultural psychology generally, and to Hofstede's work in particular. It also supports the need for indigenous research and models in this field which avoid the ethnocentrism inherent in much Western theory and research.

17.
The risk of catastrophic failures, for example in the aviation and aerospace industries, can be approached from different angles (e.g., statistics when they exist, or a detailed probabilistic analysis of the system). Each new accident carries information that has already been included in the experience base or constitutes new evidence that can be used to update a previous assessment of the risk. In this paper, we take a different approach and consider the risk and the updating from the investor's point of view. Based on the market response to past airplane accidents, we examine which ones have created a surprise response and which ones are considered part of the risk of the airline business as previously assessed. To do so, we quantify the magnitude and the timing of the observed market response to catastrophic accidents, and we compare it to an estimate of the response that would be expected based on the true actual cost of the accident including direct and indirect costs (full-cost information response). First, we develop a method based on stock market data to measure the actual market response to an accident and we construct an estimate of the full-cost information response to such an event. We then compare the two figures for the immediate and the long-term response of the market for the affected firm, as well as for the whole industry group to which the firm belongs. As an illustration, we analyze a sample of ten fatal accidents experienced by major US domestic airlines during the last seven years. In four cases, we observed an abnormal market response. In these instances, it seems that the shareholders may have updated their estimates of the probability of a future accident in the affected airlines or more generally of the firm's future business prospects. This market reaction is not always easy to explain, much less to anticipate, a fact which management should bear in mind when planning a firm's response to such an event.
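Measuring an "abnormal" market response of the kind described above is commonly done with a market-model event study. The sketch below (our illustration, not the paper's method; the function name and windowing are assumptions) fits a simple linear market model on a pre-event estimation window and cumulates the residual returns over the event window:

```python
def abnormal_returns(stock, market, est_window, event_window):
    """Market-model event study sketch.

    Fit r_stock = a + b * r_market by ordinary least squares on the
    estimation window, then sum (actual - predicted) stock returns
    over the event window.  Arguments are parallel lists of returns
    and iterables of integer indices into them.
    """
    xs = [market[i] for i in est_window]
    ys = [stock[i] for i in est_window]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    # cumulative abnormal return over the event window
    return sum(stock[i] - (a + b * market[i]) for i in event_window)
```

A strongly negative cumulative abnormal return around an accident date would indicate the kind of surprise response the paper looks for.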

18.
We study one of the most basic online scheduling models: online single-machine scheduling with delivery times, where jobs arrive over time. We provide the first randomized algorithm for this model, show that it is 1.55370-competitive, and show that this analysis is tight. The best possible deterministic algorithm is 1.61803-competitive. Our algorithm is a distribution over two deterministic algorithms. We show that any such algorithm is no better than 1.5-competitive. To our knowledge, this is the first lower-bound proof for a distribution over two deterministic algorithms.
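To make the model concrete: each job has a release time, a processing time, and a delivery time, and the objective is the maximum over jobs of completion time plus delivery time. The sketch below simulates the simple greedy rule "always run the available job with the largest delivery time"; it only illustrates the model and is not the paper's randomized algorithm:

```python
import heapq

def ldt_schedule(jobs):
    """Simulate the largest-delivery-time greedy rule on one machine.

    jobs: list of (release, processing, delivery) triples.
    Returns the objective max_j (completion_j + delivery_j).
    Illustrates the online model only; not the paper's algorithm.
    """
    pending = sorted(jobs)          # ordered by release time
    heap, t, obj, i = [], 0, 0, 0
    while i < len(pending) or heap:
        # admit every job released by the current time t
        while i < len(pending) and pending[i][0] <= t:
            r, p, q = pending[i]
            heapq.heappush(heap, (-q, p))   # max-heap on delivery time
            i += 1
        if not heap:                # machine idles until the next release
            t = pending[i][0]
            continue
        neg_q, p = heapq.heappop(heap)
        t += p                      # run the chosen job to completion
        obj = max(obj, t - neg_q)   # completion + delivery
    return obj
```

For two jobs released at time 0 with (processing, delivery) of (2, 5) and (1, 0), the rule runs the high-delivery job first and achieves objective 7, which is optimal here.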

19.
This paper uses a Rokeach Value Survey methodology to ask again, now in the mid-1990s, whether business student ethics differ from non-business student ethics. Additionally, the paper addresses the question of whether a course can alter or change student ethics and values during a semester. Thirdly, it attempts to operationalize and empirically test and measure the new ethical concepts of moral management and moral maximization.

20.
The problems dealt with in this paper are generalizations of the set cover problem, min{cx | Ax ≥ b, x ∈ {0,1}^n}, where c ∈ Q₊ⁿ, A ∈ {0,1}^{m×n}, and b ≥ 1. The covering 0-1 integer program is the variant, in this formulation, with arbitrary nonnegative entries of A and b, while the partial set cover problem requires only K of the m constraints (or more) in Ax ≥ b to be satisfied, when an integer K is additionally specified. While many approximation algorithms have recently been developed for these problems and their special cases using computationally rather expensive (albeit polynomial) LP-rounding (or SDP-rounding), we present a more efficient, purely combinatorial algorithm and investigate its approximation capability for them. It will be shown that, when compared with the best performance known today, obtained by rounding methods, although its performance falls short in some special cases, it is at least equally good in general, extends to partial vertex cover, and improves for weighted multicover, partial set cover, and further generalizations.
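The classic purely combinatorial baseline for this problem family is the weighted greedy heuristic, which repeatedly picks the set with the best cost per newly covered element. This sketch shows that style of algorithm (the standard H_n-approximation greedy, not the paper's algorithm):

```python
def greedy_set_cover(universe, sets, weight):
    """Weighted greedy set cover.

    universe: iterable of elements to cover.
    sets: dict mapping set name -> iterable of elements.
    weight: dict mapping set name -> positive cost.
    Repeatedly choose the set minimizing cost / (newly covered
    elements); returns the chosen set names in pick order.
    """
    uncovered, chosen = set(universe), []
    while uncovered:
        name = min((s for s in sets if set(sets[s]) & uncovered),
                   key=lambda s: weight[s] / len(set(sets[s]) & uncovered))
        chosen.append(name)
        uncovered -= set(sets[name])
    return chosen
```

With universe {1, 2, 3, 4} and sets a = {1, 2, 3} (cost 1), b = {3, 4} (cost 1), c = {4} (cost 0.4), greedy first takes a (cost 1/3 per element) and then c, for total cost 1.4.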


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号