Similar Articles
20 similar articles found.
1.
Efficient algorithms for finding a longest common increasing subsequence
We study the problem of finding a longest common increasing subsequence (LCIS) of multiple sequences of numbers. The LCIS problem is a fundamental issue in various application areas, including whole genome alignment. In this paper we give an efficient algorithm to find the LCIS of two sequences, with a running time stated in terms of n, the length of each sequence, r, the number of ordered pairs of positions at which the two sequences match, ℓ, the length of the LCIS, and Sort(n), the time to sort n numbers. For m sequences, where m ≥ 3, we find the LCIS in time again involving Sort(n), where r is the total number of m-tuples of positions at which the m sequences match. The previous results find the LCIS of two sequences in O(n²) time and in time involving Sort(n). Our algorithm is faster when r is relatively small.
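As a concrete point of reference for the quadratic bound mentioned above, the following is a minimal sketch of the classical O(n²) dynamic program for the LCIS of two sequences; it illustrates the problem, not the paper's faster algorithm, and the function name is ours.

```python
def lcis_length(a, b):
    """Classical O(len(a) * len(b)) DP for the longest common increasing subsequence."""
    f = [0] * len(b)          # f[j]: length of an LCIS ending with b[j]
    for x in a:
        best = 0              # best f[j] over earlier positions j with b[j] < x
        for j, y in enumerate(b):
            if y == x and best + 1 > f[j]:
                f[j] = best + 1    # extend an increasing chain ending below x
            if y < x and f[j] > best:
                best = f[j]
    return max(f, default=0)
```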

2.
We study the problem of processing supergraph queries on graph databases. A graph database D is a large set of graphs. A supergraph query q on D retrieves all the graphs in D of which q is a supergraph. The large number of graphs in databases and the NP-completeness of subgraph isomorphism testing make it challenging to process supergraph queries efficiently. In this paper, a new approach to processing supergraph queries is proposed. Specifically, a method for compactly organizing graph databases is first presented. Common subgraphs of the graphs in a database are stored only once in the compact organization, in order to reduce the overall cost of subgraph isomorphism tests from the stored graphs to queries during query processing. Then, an exact algorithm and an approximate algorithm for generating the significant feature set with optimal order are proposed, followed by algorithms for constructing indices on graph databases. The optimal order on the feature set reduces the number of subgraph isomorphism tests during query processing. Based on the compact organization of graph databases, a novel algorithm for testing subgraph isomorphism from multiple graphs to one graph is presented. Finally, based on all the above techniques, a query processing method is proposed. Analytical and experimental results show that the proposed algorithms outperform the existing similar algorithms by one to two orders of magnitude.

3.
In this paper, we revisit a recent variant of the longest common subsequence (LCS) problem, the string-excluding constrained LCS (STR-EC-LCS) problem, which was first addressed by Chen and Chao (J Comb Optim 21(3):383–392, 2011). Given two sequences \(X\) and \(Y\) of lengths \(m\) and \(n,\) respectively, and a constraint string \(P\) of length \(r,\) we are to find a common subsequence \(Z\) of \(X\) and \(Y\) that excludes \(P\) as a substring and whose length is maximized. In fact, this problem cannot be correctly solved by the previously proposed algorithm, so we give a correct algorithm that solves it in \(O(mnr)\) time. We then revisit the STR-EC-LCS problem with multiple constraints \(\{ P_1, P_2, \ldots , P_k \},\) and propose a polynomial-time algorithm which runs in \(O(mnR)\) time, where \(R = \sum _{i=1}^{k} |P_i|;\) this refutes the previous claim of NP-hardness.
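The single-constraint algorithm described above can be sketched as a dynamic program whose extra index records how much of \(P\) the current common subsequence matches at its end, advanced with a KMP automaton. This is a hedged reconstruction of that kind of O(mnr) DP, not necessarily the authors' exact formulation; the names and the assumption that \(P\) is non-empty are ours.

```python
def kmp_automaton(p):
    """Return step(k, c): given that the last k chosen characters match p[:k],
    the longest prefix of p matched after appending character c."""
    fail = [0] * len(p)
    for i in range(1, len(p)):
        k = fail[i - 1]
        while k and p[i] != p[k]:
            k = fail[k - 1]
        fail[i] = k + 1 if p[i] == p[k] else 0

    def step(k, c):
        while k and c != p[k]:
            k = fail[k - 1]
        return k + 1 if c == p[k] else 0

    return step


def str_ec_lcs(x, y, p):
    """O(|x|*|y|*|p|) DP: dp[j][k] = best length of a common subsequence of the
    processed prefix of x and of y[:j] whose suffix matches exactly p[:k]
    (always k < |p|, so p never occurs as a substring). Assumes p is non-empty."""
    step = kmp_automaton(p)
    n, r = len(y), len(p)
    NEG = float("-inf")
    dp = [[0] + [NEG] * (r - 1) for _ in range(n + 1)]   # empty prefix of x
    for ch in x:
        new = [row[:] for row in dp]                      # option: drop this x-character
        for j in range(1, n + 1):
            for k in range(r):                            # option: drop y[j-1]
                if new[j - 1][k] > new[j][k]:
                    new[j][k] = new[j - 1][k]
            if ch == y[j - 1]:                            # option: match ch with y[j-1]
                for k in range(r):
                    if dp[j - 1][k] == NEG:
                        continue
                    k2 = step(k, ch)
                    if k2 < r and dp[j - 1][k] + 1 > new[j][k2]:
                        new[j][k2] = dp[j - 1][k] + 1
        dp = new
    return max(max(row) for row in dp)
```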

4.
We study efficient algorithms for establishing reliable connections with bandwidth guarantees in communication networks. In the normal mode of operation, each connection uses a primary path to deliver packets from the source to the destination. To ensure continuous operation in the event of an edge failure, each connection uses a set of backup bridges, each bridge protecting a portion of the primary path. To meet the bandwidth requirement of the connection, a certain amount of bandwidth must be allocated on the edges of the primary path, as well as on the backup edges. In this paper, we focus on minimizing the amount of required backup allocation by sharing backup bandwidth among different connections. We consider efficient sharing schemes that require only partial information about the current state of the network. Specifically, the only information available for each edge is the total amount of primary allocation and the cost of allocating backup bandwidth on this edge. We consider the problem of finding a minimum-cost backup allocation together with a set of bridges for a given primary path. We prove that this problem is NP-hard and present an approximation algorithm whose performance guarantee depends on n, the number of edges in the primary path. We also consider the problem of finding both a primary path and a backup allocation of minimal total cost. A preliminary version of this paper appears in the Proceedings of the 13th Annual European Symposium on Algorithms (ESA 2005), Mallorca, Spain. J. (Seffi) Naor: This research is supported in part by a foundational and strategical research grant from the Israeli Ministry of Science, and by US-Israel BSF Grant 2002276.

5.
6.
Let \(G=(V,\, E)\) be a given directed graph in which every edge e is associated with two nonnegative costs: a weight w(e) and a length l(e). For a pair of specified distinct vertices \(s,\, t\in V\), the k-(edge) disjoint constrained shortest path (kCSP) problem is to compute k (edge) disjoint paths between s and t, such that the total length of the paths is minimized and the weight is bounded by a given weight budget \(W\in \mathbb {R}_{0}^{+}\). The problem is known to be \({\mathcal {NP}}\)-hard, even when \(k=1\) (Garey and Johnson in Computers and intractability, 1979). Approximation algorithms with bifactor ratios \(\left( 1\,+\,\frac{1}{r},\, r\left( 1\,+\,\frac{2(\log r\,+\,1)}{r}\right) (1\,+\,\epsilon )\right) \) and \((1\,+\,\frac{1}{r},\,1\,+\,r)\) have been developed for \(k=2\) in Orda and Sprintson (IEEE INFOCOM, pp. 727–738, 2004) and Chao and Hong (IEICE Trans Inf Syst 90(2):465–472, 2007), respectively. For general k, an approximation algorithm with ratio \((1,\, O(\ln n))\) has been developed for a weaker version of kCSP, the k bi-constraint path problem, which is to compute k disjoint st-paths satisfying a given length constraint and a weight constraint simultaneously (Guo et al. in COCOON, pp. 325–336, 2013). This paper first gives an approximation algorithm with bifactor ratio \((2,\,2)\) for kCSP using the LP-rounding technique. The algorithm is then improved by adopting a more sophisticated method to round edges. It is shown that for any solution output by the improved algorithm, there exists a real number \(0\le \alpha \le 2\) such that the weight and the length of the solution are bounded by \(\alpha \) times and \(2-\alpha \) times that of an optimum solution, respectively. The key observation of the ratio proof is that the fractional edges in a basic solution to the proposed linear relaxation of kCSP compose a graph in which every vertex has degree exactly two. Finally, by a novel enhancement of the technique in Guo et al. (COCOON, pp. 325–336, 2013), the approximation ratio is further improved to \((1,\,\ln n)\).

7.
The syntenic distance between two genomes is the minimum number of fusions, fissions, and translocations required to transform one genome into the other, ignoring the gene order within chromosomes. As the problem is NP-hard in general, some particular classes of synteny instances, such as linear synteny, exact synteny and nested synteny, have been examined in the literature. In this paper, we propose a new special class of synteny instances, called uncovering synteny. We first present a polynomial-time algorithm that solves the connected case of uncovering synteny optimally. By performing only intra-component moves, we then solve the unconnected case of uncovering synteny. We further calculate the diameters of connected and unconnected uncovering synteny, respectively.

8.
Greedy algorithms are simple, but their relative power is not well understood. The priority framework (Borodin et al. in Algorithmica 37:295–326, 2003) captures a key notion of “greediness” in the sense that it processes (in some locally optimal manner) one data item at a time, depending on and only on the current knowledge of the input. This algorithmic model provides a tool to assess the computational power and limitations of greedy algorithms, especially in terms of their approximability. In this paper, we study priority algorithm approximation ratios for the Subset-Sum Problem, focusing on the power of revocable decisions, in which accepted data items can later be rejected to maintain the feasibility of the solution. We first provide a tight bound of α≈0.657 for irrevocable priority algorithms. We then show that the approximation ratio of fixed-order revocable priority algorithms is between β≈0.780 and γ≈0.852, and the ratio of adaptive-order revocable priority algorithms is between 0.8 and δ≈0.893. A preliminary version of this paper appeared in the Proceedings of COCOON 2007, LNCS 4598, pp. 504–514.
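As an illustration of the irrevocable, fixed-order case, the textbook greedy below considers items in nonincreasing order and accepts an item only if it keeps the running sum within the bound; it is a generic example of a priority algorithm for Subset-Sum, not one of the specific algorithms analyzed in the paper.

```python
def greedy_subset_sum(items, capacity):
    """Irrevocable fixed-order priority algorithm for Subset-Sum:
    consider items from largest to smallest, accept each item iff it still fits.
    Returns the achieved sum (to be compared against the optimum)."""
    total = 0
    for x in sorted(items, reverse=True):
        if total + x <= capacity:
            total += x          # decision is final: accepted items are never dropped
    return total
```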

9.
An edge-colored graph G is rainbow connected if any two vertices are connected by a path whose edges have distinct colors. The rainbow connection of a connected graph G, denoted rc(G), is the smallest number of colors needed to make G rainbow connected. In the first result of this paper we prove that computing rc(G) is NP-hard, solving an open problem from Caro et al. (Electron. J. Comb. 15, 2008, Paper R57). In fact, we prove that it is already NP-complete to decide whether rc(G)=2, and also that it is NP-complete to decide whether a given edge-colored (with an unbounded number of colors) graph is rainbow connected. On the positive side, we prove that for every ε>0, a connected graph with minimum degree at least εn has bounded rainbow connection, where the bound depends only on ε, and a corresponding coloring can be constructed in polynomial time. Additional non-trivial upper bounds, as well as open problems and conjectures, are also presented.
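A brute-force check of the rainbow-connectivity definition is shown below; consistent with the NP-completeness result above, it takes exponential time in general. Vertex labels 0..n−1 and the (u, v, color) edge format are assumptions of this sketch.

```python
from itertools import combinations

def rainbow_connected(n, colored_edges):
    """Definition check: every pair of vertices is joined by a path whose
    edge colors are pairwise distinct. Exponential time in the worst case."""
    adj = {v: [] for v in range(n)}
    for u, v, c in colored_edges:
        adj[u].append((v, c))
        adj[v].append((u, c))

    def rainbow_path(u, t, visited, used_colors):
        if u == t:
            return True
        for v, c in adj[u]:
            if v not in visited and c not in used_colors:
                if rainbow_path(v, t, visited | {v}, used_colors | {c}):
                    return True
        return False

    return all(rainbow_path(s, t, {s}, frozenset())
               for s, t in combinations(range(n), 2))
```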

10.
Competent genetic algorithms for weighing matrices
In this paper, we demonstrate that the search for weighing matrices constructed from two circulants can be viewed as a minimization problem, and we give two competent genetic algorithms for locating optima of an objective function. The motivation to deal with the messy genetic algorithm (mGA) comes from the pioneering results of Goldberg regarding the ability of the mGA to put tight genes together in a solution, which points directly to structural patterns in weighing matrices. To take advantage of certain properties of two ternary sequences with zero autocorrelation, we use an adaptation of the fast messy GA (fmGA), in which we combine the mGA with advanced techniques such as thresholding and tie-breaking. This transformation of the weighing matrices problem into an instance of a combinatorial optimization problem seems promising, since we resolved two open cases for weighing matrices as listed in the second edition of the Handbook of Combinatorial Designs.

11.
The problem of interest is covering a given point set with homothetic copies of several convex containers C_1, …, C_k, while the objective is to minimize the maximum over the dilatation factors. Such k-containment problems arise in various applications, e.g. in facility location, shape fitting, data classification or clustering. So far most attention has been paid to the special case of the Euclidean k-center problem, where all containers C_i are Euclidean unit balls. Recent developments based on so-called core-sets enable not only better theoretical bounds on the running time of approximation algorithms but also improvements in practically solvable input sizes. Here, we present some new geometric inequalities and a Mixed-Integer-Convex-Programming formulation. Both are used in a very effective branch-and-bound routine which not only improves on the best known running times in the Euclidean case but also handles general and even different containers among the C_i.

12.
We present two polynomial-time approximation algorithms for the metric case of the maximum traveling salesman problem, one for directed graphs and one for undirected graphs. Both algorithms improve on the previous best approximation ratios. A preliminary version of this paper appeared in the Proceedings of the 13th European Symposium on Algorithms (ESA 2005), Lecture Notes in Computer Science, Vol. 3669, pp. 179–190, 2005.

13.
The maximum clique problem provides a classic framework for detecting cohesive subgraphs. However, this approach can fail to detect much of the cohesive structure in a graph. To address this issue, Seidman and Foster introduced k-plexes as a degree-based clique relaxation. More recently, Balasundaram et al. formulated the maximum k-plex problem as an integer program and designed a branch-and-cut algorithm. This paper derives a new upper bound on the cardinality of k-plexes and adapts combinatorial clique algorithms to find maximum k-plexes.
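The k-plex condition itself is straightforward to verify: every vertex of the candidate set must have at least |S| − k neighbours inside the set (k = 1 recovers a clique). A minimal sketch, assuming adjacency is given as a dict of neighbour sets:

```python
def is_k_plex(adj, s, k):
    """Check the k-plex condition: every vertex of s has at least |s| - k
    neighbours inside s. adj maps each vertex to its set of neighbours."""
    s = set(s)
    return all(len(adj[v] & s) >= len(s) - k for v in s)
```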

14.
The container pre-marshalling problem (CPMP) aims to rearrange the containers in a bay with the least movement effort, so that in the final layout the containers are piled according to a predetermined order. Previous researchers, without exception, assumed that all the stacks in a bay are functionally identical. This classical problem setting is reexamined in this paper. Moreover, a new problem, the CPMP with a dummy stack (CPMPDS), is proposed. At terminals with transfer lanes, a bay includes a row of ordinary stacks and a dummy stack. The dummy stack is actually the bay space reserved for trucks, so that containers can be shipped out of the bay. During the pre-marshalling process, the dummy stack temporarily stores containers like an ordinary stack; however, it must be emptied by the end of pre-marshalling. In this paper, target-guided algorithms are proposed to handle both the classical CPMP and the new CPMPDS. All the proposed algorithms guarantee termination. Experimental results on the CPMP show that the proposed algorithms surpass the state-of-the-art algorithm.
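A sketch of the pre-marshalling goal test under one common convention, in which a bay is sorted when every stack has non-increasing retrieval priorities from bottom to top; the exact "predetermined order" may be defined differently in the paper.

```python
def bay_is_sorted(bay):
    """Goal test for pre-marshalling under a common convention:
    in every stack, retrieval priorities are non-increasing from bottom to top,
    so no container blocks one that must leave earlier.
    bay is a list of stacks; each stack lists priorities bottom-up."""
    return all(all(stack[i] >= stack[i + 1] for i in range(len(stack) - 1))
               for stack in bay)
```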

15.
We introduce canonical forms that represent certain equivalence classes of totally cyclic and acyclic orientations of graphs, and present polynomial-time algorithms for their construction. The forms are used in new formulas for evaluating tension and flow polynomials of graphs.

16.
We study the recently introduced Connected Feedback Vertex Set (CFVS) problem from the viewpoint of parameterized algorithms. CFVS is the connected variant of the classical Feedback Vertex Set problem and is defined as follows: given a graph G=(V,E) and an integer k, decide whether there exists F ⊆ V, |F| ≤ k, such that G[V∖F] is a forest and G[F] is connected. We show that Connected Feedback Vertex Set can be solved in time $O(2^{O(k)}n^{O(1)})$ on general graphs and in time $O(2^{O(\sqrt{k}\log k)}n^{O(1)})$ on graphs excluding a fixed graph H as a minor. Our result on general undirected graphs uses, as a subroutine, a parameterized algorithm for Group Steiner Tree, a well-studied variant of Steiner Tree. We find the algorithm for Group Steiner Tree of independent interest and believe that it could be useful for obtaining parameterized algorithms for other connectivity problems.
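Verifying that a candidate set F is a connected feedback vertex set follows directly from the definition above: G[V∖F] must be acyclic and G[F] connected. A minimal sketch assuming an undirected simple graph on vertices 0..n−1 given as an edge list:

```python
def is_cfvs(n, edges, f):
    """Check the CFVS definition: G[V - F] is a forest and G[F] is connected."""
    f = set(f)

    def induced(vertices):
        vs = set(vertices)
        return vs, [(u, v) for u, v in edges if u in vs and v in vs]

    def is_forest(vs, es):
        parent = {v: v for v in vs}           # union-find over the induced vertices
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        for u, v in es:
            ru, rv = find(u), find(v)
            if ru == rv:                      # this edge would close a cycle
                return False
            parent[ru] = rv
        return True

    def is_connected(vs, es):
        if not vs:
            return True
        adj = {v: [] for v in vs}
        for u, v in es:
            adj[u].append(v)
            adj[v].append(u)
        seen, stack = set(), [next(iter(vs))]
        while stack:                          # DFS from an arbitrary vertex of F
            x = stack.pop()
            if x in seen:
                continue
            seen.add(x)
            stack.extend(adj[x])
        return len(seen) == len(vs)

    rest = set(range(n)) - f
    return is_forest(*induced(rest)) and is_connected(*induced(f))
```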

17.
We consider the bus evacuation problem (BEP). Given a positive integer B, a bipartite graph G with parts S and \(T \cup \{r\}\) in a metric space, and functions \(l_i :S \rightarrow {\mathbb {Z}}_+\) and \({u_j :T \rightarrow \mathbb {Z}_+ \cup \{\infty \}}\), one wishes to find a set of B walks in G. Every walk should start at r and finish in T, and r must be visited only once. Also, over all walks together, each vertex i of S must be visited at least \(l_i\) times and each vertex j of T must be visited at most \(u_j\) times. The objective is to find a solution that minimizes the length of the longest walk. This problem arises in emergency planning situations where the walks correspond to the routes of B buses that must transport each group of people in S to a shelter in T, and the objective is to evacuate the entire population in the minimum amount of time. In this paper, we prove that approximating this problem within a factor less than a certain constant is \(\text{ NP }\)-hard, and we present a 10.2-approximation algorithm. Further, for the uncapacitated BEP, in which \(u_j\) is infinity for each j, we give a 4.2-approximation algorithm.
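The sketch below evaluates a candidate set of walks under the definition above, checking the visit bounds and returning the makespan; the data layout (dicts for \(l\) and \(u\), a distance callable) is an assumption of this example rather than anything prescribed by the paper.

```python
def bep_makespan(dist, walks, r, l, u):
    """Partial feasibility check and objective for the bus evacuation problem:
    every walk is the ordered list of vertices visited after leaving the depot r;
    pickup point i needs at least l[i] visits, shelter j allows at most u[j];
    the objective is the length of the longest walk under the metric dist(a, b)."""
    visits = {}
    for walk in walks:
        for v in walk:
            visits[v] = visits.get(v, 0) + 1
    feasible = (all(visits.get(i, 0) >= need for i, need in l.items()) and
                all(visits.get(j, 0) <= cap for j, cap in u.items()))
    if not feasible:
        return None

    def walk_length(walk):
        total, prev = 0, r
        for v in walk:
            total += dist(prev, v)
            prev = v
        return total

    return max((walk_length(w) for w in walks), default=0)
```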

18.
Approximation algorithms for connected facility location problems
We study Connected Facility Location problems. We are given a connected graph G=(V,E) with a nonnegative edge cost c_e for each edge e ∈ E, a set of clients D ⊆ V such that each client j ∈ D has positive demand d_j, and a set of facilities F ⊆ V, each facility having a nonnegative opening cost f_i and the capacity to serve all client demands. The objective is to open a subset of facilities, to assign each client j ∈ D to exactly one open facility i(j), and to connect all open facilities by a Steiner tree T, such that the total cost is minimized for a given input parameter M ≥ 1. We propose an LP-rounding based 8.29-approximation algorithm, which improves the previous bound of 8.55 (Swamy and Kumar in Algorithmica, 40:245–269, 2004). We also consider the problem when the opening costs of all facilities are equal; in this case we give a 7.0-approximation algorithm.
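In the formulation of Swamy and Kumar cited above, the cost being minimized is the sum of facility opening costs, demand-weighted assignment costs, and M times the cost of the Steiner tree connecting the open facilities. A small sketch of evaluating that objective for a candidate solution; the data layout is an assumption of this example.

```python
def confl_cost(open_facilities, assign, tree_edges, f, d, c, M):
    """Connected facility location objective (Swamy-Kumar style):
    opening costs + demand-weighted assignment costs + M * Steiner tree cost.
    assign maps client j -> its open facility i(j); c[(u, v)] is the
    (shortest-path) cost between u and v; tree_edges are keys of c."""
    opening = sum(f[i] for i in open_facilities)
    assignment = sum(d[j] * c[(j, i)] for j, i in assign.items())
    steiner = M * sum(c[e] for e in tree_edges)
    return opening + assignment + steiner
```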

19.
20.
Efficient Algorithms for Similarity Search
The problem of our interest takes as input a database of m sequences over an alphabet and an integer k. The goal is to report all pairs of sequences that have a matching subsequence of length at least k. We employ two algorithms to solve this problem. The first algorithm is based on sorting and the second is based on generalized suffix trees. We provide experimental data comparing the performance of these algorithms. The generalized suffix tree based algorithm performs better than the sorting-based algorithm.
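Reading "a matching subsequence of length at least k" as a common substring, which is what both the sorting and suffix-tree approaches exploit, a sorting/hashing-style sketch of the pair-reporting step could look like the following; the details are an assumption of this example, not the paper's implementation.

```python
from collections import defaultdict
from itertools import combinations

def similar_pairs(seqs, k):
    """Two sequences share a common substring of length >= k iff they share
    some length-k substring (k-mer). Bucket sequence indices by k-mer and
    report every pair that lands in the same bucket."""
    buckets = defaultdict(set)
    for idx, s in enumerate(seqs):
        for i in range(len(s) - k + 1):
            buckets[s[i:i + k]].add(idx)
    pairs = set()
    for ids in buckets.values():
        pairs.update(combinations(sorted(ids), 2))
    return pairs
```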
