Similar literature (20 results)
1.
2.
A step toward a strategic foundation for rational expectations equilibrium is taken by considering a double auction with n buyers and m sellers with interdependent values and affiliated private information. If there are sufficiently many buyers and sellers, and their bids are restricted to a sufficiently fine discrete set of prices, then, generically, there is an equilibrium in nondecreasing bidding functions that is arbitrarily close to the unique fully revealing rational expectations equilibrium of the limit market with unrestricted bids and a continuum of agents. In particular, the large double-auction equilibrium is almost efficient and almost fully aggregates the agents' information.

3.
We study the optimal tree structure for the key management problem. In the key tree, when two or more leaves are deleted or replaced, the updating cost is defined to be the number of encryptions needed to securely update the remaining keys. The objective is to find the tree structure for which the worst case updating cost is minimum. We extend the result on the 1-replacement problem in Snoeyink et al. (Proceedings of the twentieth annual IEEE conference on computer communications, pp. 422–431, 2001) to the k-user case. We first show that the k-deletion problem can be solved by handling the k-replacement problem. Focusing on the k-replacement problem, we show that the optimal tree can be computed in $O(n^{(k+1)^{2}})$ time for constant k. Then we derive a tight degree bound for the optimal tree when replacing two users. By investigating the maximum number of leaves that can be placed on the tree given a fixed worst case replacement cost, we prove that a balanced tree with root degree 5≤d≤7, where the number of leaves in the subtrees differs by at most one and each subtree is a 2-3 tree, can always achieve the minimum worst case 2-replacement cost. Thus the optimal tree for the two-user replacement problem can be efficiently constructed in O(n) time.
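For intuition about the cost model in this abstract, the sketch below is a minimal illustration (not the authors' algorithm), assuming an LKH-style accounting: when a batch of leaves is replaced, every key on a path from a replaced leaf to the root must be refreshed, and refreshing an internal node costs one encryption per child. The function name `update_cost` and the dictionary-based tree encoding are hypothetical.

```python
# Illustrative sketch only (not the paper's algorithm): batch key-replacement
# cost in a rooted key tree, assuming an LKH-style model in which every key on
# a path from a replaced leaf to the root is refreshed, and refreshing an
# internal node costs one encryption per child.
def update_cost(children, parent, replaced_leaves):
    """children: dict node -> list of children; parent: dict node -> parent
    (root maps to None); replaced_leaves: iterable of replaced leaf nodes."""
    dirty = set()                       # internal nodes whose keys must change
    for leaf in replaced_leaves:
        v = parent[leaf]
        while v is not None and v not in dirty:
            dirty.add(v)
            v = parent[v]
    # one encryption per child of each refreshed internal node
    return sum(len(children[v]) for v in dirty)

# Tiny example: a root with three internal children, each holding three leaves.
children = {'r': ['a', 'b', 'c'],
            'a': ['a1', 'a2', 'a3'], 'b': ['b1', 'b2', 'b3'], 'c': ['c1', 'c2', 'c3']}
parent = {c: p for p, cs in children.items() for c in cs}
parent['r'] = None
print(update_cost(children, parent, ['a1', 'b2']))  # refresh a, b, r -> 3+3+3 = 9
```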

4.
This paper studies supply chain coordination in a procurement environment with incomplete information. An auction is not only a price-determination mechanism; it can also serve as a supply chain coordination mechanism, and the literature has shown that the choice of auction mechanism significantly affects the participants' payoffs and the efficiency of the whole supply chain. For a market with n suppliers and one buyer, where the auction environment satisfies the independent private values (IPV) condition and the inverse market demand function is logarithmic, we derive the expected revenues that wholesale-price auctions, catalog auctions, and two-part contract auctions generate for each party. We prove that wholesale-price auctions and catalog auctions cannot achieve channel coordination, whereas the system can achieve channel coordination under a two-part contract auction mechanism with an information intermediary.

5.
Jin Ho Choi, Yong Sik Chang, Ingoo Han. Omega, 2009, 37(2): 482–493.
This study proposes a new and highly efficient dynamic combinatorial auction mechanism, the N-bilateral optimized combinatorial auction (N-BOCA). N-BOCA is a flexible iterative combinatorial auction model that offers more optimized trading for multiple suppliers and purchasers in the supply chain than a one-sided combinatorial auction. We design the N-BOCA model from the perspectives of market architecture, trading rules, and the decision strategy for winner determination. Because winner determination requires flexible optimization modeling capability, rule-based reasoning is applied to express flexible decision strategies. We also demonstrate the viability of N-BOCA through paired-samples t-test experiments, which show that N-BOCA yields higher purchasing efficiency and effectiveness than the one-auctioneer-to-multiple-bidders (1-to-N) combinatorial auction mechanism.

6.
An improved approximation algorithm is presented in this paper for the multicast k-tree routing problem. The algorithm has a worst case performance ratio of (2.4 + ρ), where ρ is the best approximation ratio for the metric Steiner tree problem (and is about 1.55 so far). The previous best approximation algorithm for the multicast k-tree routing problem has a performance ratio of 4. Two techniques, weight averaging and tree partitioning, are developed to facilitate the algorithm design and analysis. Research supported by AICML, CFI, NSERC, PENCE, a Startup Grant from the University of Alberta, and NNSF Grant 60373012.

7.
This paper proposes a structural nonequilibrium model of initial responses to incomplete-information games based on “level-k” thinking, which describes behavior in many experiments with complete-information games. We derive the model's implications in first- and second-price auctions with general information structures, compare them to equilibrium and Eyster and Rabin's (2005) “cursed equilibrium,” and evaluate the model's potential to explain nonequilibrium bidding in auction experiments. The level-k model generalizes many insights from equilibrium auction theory. It allows a unified explanation of the winner's curse in common-value auctions and overbidding in those independent-private-value auctions without the uniform value distributions used in most experiments.

8.
Electronic reverse auctions are a commonly used procurement mechanism. Research to date has focused on suppliers who are ex ante symmetric in that their costs are drawn from a common distribution. However, in many cases, a seller's range of potential costs depends on their own operations, location, or economies of scale and scope. Thus, understanding how different bidder types impact auction outcomes is key when designing an auction. This study reports the results of the first controlled laboratory experiment designed to compare prices between first-price and second-price procurement auctions for homogeneous goods when seller cost types are asymmetric and the number of bidders varies. The results indicate that first-price auctions generate lower prices regardless of market composition. The results also reveal that first-price auctions are at least weakly more efficient than second-price auctions despite the theoretical prediction that the reverse should hold in asymmetric auctions. Post hoc analysis of individual bidders' behavior in first-price auctions revealed evidence that bidders systematically underbid when their cost realizations were close to the lower bound. Furthermore, bidders adjust their behavior based on the type of the other bidders in the market in a manner inconsistent with theory. Consequently, adding a third bidder to a two-bidder market is not advantageous to the buyer unless that third bidder is a low-cost type.

9.
Mechanism design enables a social planner to obtain a desired outcome by leveraging the players' rationality and their beliefs. It is thus a fundamental, but yet unproven, intuition that the higher the level of rationality of the players, the better the set of obtainable outcomes. In this paper, we prove this fundamental intuition for players with possibilistic beliefs, a model long considered in epistemic game theory. Specifically, • we define a sequence of monotonically increasing revenue benchmarks for single-good auctions, G0 ≤ G1 ≤ G2 ≤ ⋯, where each Gi is defined over the players' beliefs and G0 is the second-highest valuation (i.e., the revenue benchmark achieved by the second-price mechanism); • we (1) construct a single, interim individually rational, auction mechanism that, without any clue about the rationality level of the players, guarantees revenue Gk if all players have rationality levels ≥ k+1, and (2) prove that no such mechanism can guarantee revenue even close to Gk when at least two players are at most level-k rational.

10.
Given a simple, undirected graph G=(V,E) and a weight function w:E→ℤ+, we consider the problem of orienting all edges in E so that the maximum weighted outdegree among all vertices is minimized. It has previously been shown that the unweighted version of the problem is solvable in polynomial time while the weighted version is (weakly) NP-hard. In this paper, we strengthen these results as follows: (1) We prove that the weighted version is strongly NP-hard even if all edge weights belong to the set {1,k}, where k is any fixed integer greater than or equal to 2, and that there exists no pseudo-polynomial time approximation algorithm for this problem whose approximation ratio is smaller than (1+1/k) unless P = NP; (2) we present a new polynomial-time algorithm that approximates the general version of the problem within a ratio of (2−1/k), where k is the maximum weight of an edge in G; (3) we show how to approximate the special case in which all edge weights belong to {1,k} within a ratio of 3/2 for k=2 (note that this matches the inapproximability bound above), and (2−2/(k+1)) for any k≥3, respectively, in polynomial time.
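As a concrete illustration of the objective (the maximum weighted outdegree after all edges are oriented), here is a naive greedy baseline; it is not the (2−1/k)-approximation algorithm from the abstract, and the function name `greedy_orient` is invented for the example. Each edge is directed out of the endpoint whose current weighted outdegree is smaller.

```python
# Naive greedy baseline for weighted edge orientation (illustration only; not
# the approximation algorithm described in the abstract). Each edge (u, v, w)
# is directed out of the endpoint with the smaller current weighted outdegree;
# the objective is the maximum weighted outdegree over all vertices.
from collections import defaultdict

def greedy_orient(edges):
    """edges: list of (u, v, w) with positive weights w."""
    out = defaultdict(int)              # current weighted outdegree per vertex
    orientation = []
    for u, v, w in sorted(edges, key=lambda e: -e[2]):   # heavy edges first
        if out[u] <= out[v]:
            orientation.append((u, v)); out[u] += w      # orient u -> v
        else:
            orientation.append((v, u)); out[v] += w      # orient v -> u
    return orientation, max(out.values(), default=0)

edges = [('a', 'b', 2), ('b', 'c', 1), ('a', 'c', 2), ('c', 'd', 1)]
print(greedy_orient(edges))   # maximum weighted outdegree 2 for this instance
```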

11.
The convex ordered median problem is a generalization of the median, the k-centrum or the center problem. The task of the associated inverse problem is to change edge lengths at minimum cost such that a given vertex becomes an optimal solution of the location problem, i.e., an ordered median. It is shown that the problem is NP-hard even if the underlying network is a tree and the ordered median problem is convex and either the vertex weights are all equal to 1 or the underlying problem is the k-centrum problem. For the special case of the inverse unit weight k-centrum problem a polynomial time algorithm is developed.

12.
It is widely known that when there are errors with a moving-average root close to −1, a high order augmented autoregression is necessary for unit root tests to have good size, but that information criteria such as the AIC and the BIC tend to select a truncation lag (k) that is very small. We consider a class of Modified Information Criteria (MIC) with a penalty factor that is sample dependent. It takes into account the fact that the bias in the sum of the autoregressive coefficients is highly dependent on k and adapts to the type of deterministic components present. We use a local asymptotic framework in which the moving-average root is local to −1 to document how the MIC performs better in selecting appropriate values of k. In Monte-Carlo experiments, the MIC is found to yield huge size improvements to the DFGLS and the feasible point optimal PT test developed in Elliott, Rothenberg, and Stock (1996). We also extend the M tests developed in Perron and Ng (1996) to allow for GLS detrending of the data. The MIC along with GLS detrended data yield a set of tests with desirable size and power properties.
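For reference, the modified criteria in this family keep the usual goodness-of-fit term but make the penalty data dependent through the estimated coefficient on the lagged level. A commonly quoted form of the modified AIC is sketched below (notation simplified and hedged; the exact definition and the GLS-detrending details are in the paper):

```latex
\mathrm{MAIC}(k) = \ln\hat{\sigma}^{2}_{k}
  + \frac{2\bigl(\tau_{T}(k) + k\bigr)}{T - k_{\max}},
\qquad
\tau_{T}(k) = \frac{\hat{\beta}_{0}^{2}}{\hat{\sigma}^{2}_{k}}
  \sum_{t=k_{\max}+1}^{T} \tilde{y}_{t-1}^{2}
```

Here β̂0 is the estimated coefficient on the lagged level ỹ(t−1) in the augmented autoregression, σ̂²(k) is the residual variance with k augmentation lags, and ỹ denotes the detrended series.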

13.
Designing trading strategies for a continuous double auction market is far more complex than for a one-sided auction. This paper first tests the Markov property of transaction prices in such a market, then builds on this to propose a self-learning dynamic trading strategy based on Markov chains, and finally shows through comparative experiments that the proposed strategy clearly outperforms the "constrained zero-intelligence" strategy.

14.
In previous studies comparing the makespan of a schedule allowed to be preempted at most i times with that of the optimal schedule with an unlimited number of preemptions, the worst case ratio was usually obtained by analyzing the structures of the optimal schedules. For the case of m identical machines, the worst case ratio was shown to be 2m/(m+i+1) for any 0 ≤ i ≤ m−1 (Braun and Schmidt in SIAM J. Comput. 32(3):671–680, 2003), and they showed that the LPT algorithm is an exact algorithm which can guarantee the worst case ratio for i=0. In this paper, we propose a simpler method which is based on the design and analysis of the algorithm and on finding a worst-case instance. It can not only obtain the worst case ratio but also give a linear algorithm which can guarantee this ratio for any 0 ≤ i ≤ m−1, and thus we generalize the previous results. We also discuss the trade-off between the objective value and the number of preemptions. In addition, we consider i-preemptive scheduling on two uniform machines. For both i=0 and i=1, we give two linear algorithms and present the worst-case ratios with respect to s, i.e., the ratio of the speeds of the two machines.
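Plugging values into the bound 2m/(m+i+1) quoted above makes the trade-off concrete: for m = 4 identical machines, each additional allowed preemption tightens the worst-case guarantee, and i = m−1 = 3 preemptions already match the unrestricted-preemption optimum.

```latex
\frac{2m}{m+i+1}\bigg|_{m=4,\ i=0} = \frac{8}{5},
\qquad
\frac{2m}{m+i+1}\bigg|_{m=4,\ i=1} = \frac{8}{6} = \frac{4}{3},
\qquad
\frac{2m}{m+i+1}\bigg|_{m=4,\ i=3} = \frac{8}{8} = 1
```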

15.
Independent sets, induced matchings and cliques are examples of regular induced subgraphs in a graph. In this paper, we prove that finding a maximum cardinality k-regular induced subgraph is an NP-hard problem for any fixed value of k. We propose a convex quadratic upper bound on the size of a k-regular induced subgraph and characterize those graphs for which this bound is attained. Finally, we extend the Hoffman bound on the size of a maximum 0-regular subgraph (the independence number) from k=0 to larger values of k.

16.
The minimum m-connected k-dominating set problem is as follows: given a graph G=(V,E) and two natural numbers m and k, find a subset S ⊆ V of minimum size such that every vertex in V∖S is adjacent to at least k vertices in S and the subgraph induced by S is m-connected. In this paper we study this problem on unit disc graphs with small m, which is motivated by the design of fault-tolerant virtual backbones for wireless sensor networks. We propose two approximation algorithms with constant performance ratios for m ≤ 2. We also discuss how to design approximation algorithms for the problem with arbitrarily large m. This work was supported in part by the Research Grants Council of Hong Kong under Grant No. CityU 1165/04E and the National Natural Science Foundation of China under Grant Nos. 70221001, 10531070 and 10771209.
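The feasibility conditions in the problem statement are easy to check directly. The sketch below (illustration only, not one of the approximation algorithms from the paper; the helper name `is_connected_k_dominating` is invented for the example) verifies k-domination of V∖S together with connectivity of the subgraph induced by S, i.e., the m = 1 case of m-connectivity.

```python
# Feasibility check for a (1-connected) k-dominating set: every vertex outside
# S must have at least k neighbors in S, and S must induce a connected
# subgraph. Illustration of the definition only, not a paper algorithm.
from collections import deque

def is_connected_k_dominating(adj, S, k):
    """adj: dict vertex -> set of neighbors; S: candidate set; k: int."""
    S = set(S)
    if not S:
        return False
    # k-domination of V \ S
    if any(len(adj[v] & S) < k for v in adj if v not in S):
        return False
    # connectivity of the subgraph induced by S (BFS restricted to S)
    start = next(iter(S))
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for w in adj[u] & S:
            if w not in seen:
                seen.add(w); queue.append(w)
    return seen == S

adj = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2, 4}, 4: {2, 3}}
print(is_connected_k_dominating(adj, {2, 3}, 2))   # True: 1 and 4 each see both
```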

17.
We study (vertex-disjoint) packings of paths of length two (i.e., of P2's) in graphs under a parameterized perspective. Starting from a maximal P2-packing ℘ of size j, we use extremal combinatorial arguments for determining how many vertices of ℘ appear in some P2-packing of size (j+1) (if such a packing exists). We prove that one can 'reuse' 2.5j vertices. We also show that this bound is asymptotically sharp. Based on a WIN-WIN approach, we build an algorithm which decides, given a graph, if a P2-packing of size at least k exists in time O*(2.448^{3k}).
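To make the packed objects concrete, the following greedy routine builds a maximal vertex-disjoint packing of two-edge paths by repeatedly taking any still-unused vertex with two unused neighbors as a path center. It is only a baseline illustrating the definition, not the extremal or WIN-WIN machinery of the paper, and the function name is hypothetical.

```python
# Greedy construction of a maximal vertex-disjoint P2-packing (paths with two
# edges / three vertices). Illustration of the object only.
def greedy_p2_packing(adj):
    """adj: dict vertex -> set of neighbors. Returns a list of (a, c, b) paths
    a - c - b with c as the middle vertex; each vertex is used at most once."""
    used, packing = set(), []
    for c in adj:                                   # try every vertex as a center
        if c in used:
            continue
        free = [v for v in adj[c] if v not in used]
        if len(free) >= 2:
            a, b = free[0], free[1]
            packing.append((a, c, b))
            used.update({a, b, c})
    return packing

adj = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 5}, 5: {4, 6}, 6: {5}}
print(greedy_p2_packing(adj))   # e.g. [(1, 2, 3), (4, 5, 6)] on a six-vertex path
```

Since the set of used vertices only grows, no unused center can gain free neighbors later, so the packing returned is maximal (though generally not maximum).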

18.
We study the problem of (off-line) broadcast scheduling in minimizing total flow time and propose a dynamic programming approach to compute an optimal broadcast schedule. Suppose the broadcast server has k pages and the last page request arrives at time n. The optimal schedule can be computed in O(k^3 (n+k)^{k−1}) time for the case that the server has a single broadcast channel. For the m-channel case, i.e., the server can broadcast m different pages at a time where m < k, the optimal schedule can be computed in O(n^{km}) time when k and m are constants. Note that this broadcast scheduling problem is NP-hard when k is a variable and will take O(n^{km+1}) time when k is fixed and m ≥ 1 with the straightforward implementation of the dynamic programming approach. The preliminary version of this paper appeared in Proceedings of the 11th Annual International Computing and Combinatorics Conference as “Off-line Algorithms for Minimizing the Total Flow Time in Broadcast Scheduling”.
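Reading the objective operationally: each request waits from its arrival until the next broadcast of the requested page, and total flow time sums these waits over all requests. Below is a small evaluation routine under that reading; it only scores a given single-channel schedule, it is not the dynamic program from the paper, and the names are invented for the example.

```python
# Evaluate total flow time of a given single-channel broadcast schedule
# (objective only; not the paper's dynamic-programming algorithm).
# schedule[t-1] is the page broadcast at integer time t, and each request
# (arrival_time, page) is served by the first broadcast of that page at or
# after its arrival; its flow time is service_time - arrival_time.
def total_flow_time(schedule, requests):
    INF = float("inf")
    def service_time(arrival, page):
        for t, p in enumerate(schedule, start=1):
            if p == page and t >= arrival:
                return t
        return INF                      # request never served by this schedule
    return sum(service_time(a, p) - a for a, p in requests)

schedule = ["A", "B", "A", "C"]                   # broadcasts at times 1, 2, 3, 4
requests = [(1, "A"), (1, "B"), (2, "A"), (2, "C")]
print(total_flow_time(schedule, requests))        # (1-1)+(2-1)+(3-2)+(4-2) = 4
```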

19.
A k-chordalisation of a graph G = (V,E) is a graph H = (V,F) obtained by adding edges to G, such that H is a chordal graph with maximum clique size at most k. This note considers the problem: given a graph G = (V,E), which pairs of vertices, non-adjacent in G, will be an edge in every k-chordalisation of G. Such a pair is called necessary for treewidth k. An equivalent formulation is: which edges can one add to a graph G such that every tree decomposition of G of width at most k is also a tree decomposition of the resulting graph G′. Some sufficient, and some necessary and sufficient, conditions are given for pairs of vertices to be necessary for treewidth k. For a fixed k, one can find in linear time for a given graph G the set of all necessary pairs for treewidth k. If k is given as part of the input, then this problem is coNP-hard. A few similar results are given when interval graphs (and hence pathwidth) are used instead of chordal graphs and treewidth.

20.
An auction is not only a price-determination mechanism; it can also serve as a coordination mechanism for a supply chain. This paper studies supply chain coordination in a procurement environment with incomplete information, where a suitably designed auction mechanism is used to coordinate the relationship between the buyer and the suppliers. For an independent-private-values setting with n suppliers and one buyer, and with a quadratic inverse market demand function, we analyze the expected revenues that three mechanisms (wholesale-price auctions, catalog auctions, and two-part contract auctions) generate for each party, and we prove that the system can achieve channel coordination under a two-part contract auction mechanism with an information intermediary.
