Similar Articles
20 similar articles found (search time: 828 ms)
1.
Alliances between competitors in which established firms provide access to proprietary resources—for example, their distribution channels—are important business practices. We analyze a market where an established firm, firm A, produces a product of well-known quality, and a firm with an unknown brand, firm B, has to choose to produce high or low quality. Firm A observes firm B's quality choice but consumers do not. Hence, firm B is subject to a moral hazard problem, which firm A can potentially solve. Firm A can agree or refuse to form an alliance with firm B, and this decision is observed by consumers. If an alliance is formed, firm A implicitly certifies the rival's product. Consumers infer that firm B is a high-quality competitor, because otherwise why would the established firm agree to form an alliance? The mechanism we identify allows for an economic interpretation of several types of business practices. (JEL: L15, L13, L24, L42, M21, M31, D43)

2.
The complexity of the Bandpass problem is re-investigated. Specifically, we show that the problem with any fixed bandpass number B ≥ 2 is NP-hard. Next, a row-stacking algorithm is proposed for the problem with three columns, which produces a solution that is at most 1 less than the optimum. For the special case B = 2, the row-stacking algorithm guarantees an optimal solution. On approximation, we present an O(B²)-approximation algorithm for the general problem, which reduces to a 2-approximation algorithm for the special case B = 2.

3.
William K. Boyes, Risk Analysis, 2011, 31(12): 1935–1948
Acute solvent exposures may contribute to automobile accidents because they increase reaction time and decrease attention, in addition to impairing other behaviors. These effects resemble those of ethanol consumption, both in behavioral profile and in neurological mechanism. These observations, along with the extensive data on the relationship between ethanol consumption and fatal automobile accidents, suggested a way to estimate the probability of fatal automobile accidents from solvent inhalation. The problem can be approached using the logic of the algebraic transitive postulate of equality: if A = B and B = C, then A = C. We first calculated a function describing the internal doses of solvent vapors that cause the same magnitude of behavioral impairment as ingestion of ethanol (A = B). Next, we fit a function to data from the literature describing the probability of fatal car crashes for a given internal dose of ethanol (B = C). Finally, we used these two functions to generate a third function that estimates the probability of a fatal car crash for any internal dose of organic solvent vapor (A = C). This latter function showed quantitatively (1) that the likelihood of a fatal car crash is increased by acute exposure to organic solvent vapors at concentrations less than 1.0 ppm, and (2) that this likelihood is similar in magnitude to the probability of developing leukemia from exposure to benzene. This approach could also be applied to other potentially adverse consequences of acute exposure to solvents (e.g., nonfatal car crashes, property damage, and workplace accidents), if appropriate data were available.
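To make the A = B, B = C, therefore A = C composition concrete, here is a minimal Python sketch. The functional forms and all coefficients (the slope k, the logistic parameters a and b, and the function names) are hypothetical placeholders, not the functions fitted in the paper:

```python
# Sketch of composing the two fitted functions described above.
import math

def equivalent_ethanol_dose(solvent_dose_ppm: float) -> float:
    """A = B: map an internal solvent vapor dose to the ethanol dose producing
    the same magnitude of behavioral impairment (hypothetical linear form)."""
    k = 0.8  # hypothetical potency factor
    return k * solvent_dose_ppm

def fatal_crash_probability(ethanol_dose: float) -> float:
    """B = C: probability of a fatal crash at a given internal ethanol dose
    (hypothetical logistic dose-response)."""
    a, b = -9.0, 1.5  # hypothetical intercept and slope
    return 1.0 / (1.0 + math.exp(-(a + b * ethanol_dose)))

def crash_risk_from_solvent(solvent_dose_ppm: float) -> float:
    """A = C: the composed function."""
    return fatal_crash_probability(equivalent_ethanol_dose(solvent_dose_ppm))

print(crash_risk_from_solvent(1.0))  # risk at a 1.0 ppm internal dose
```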

4.
Should capacitated firms set prices responsively to uncertain market conditions in a competitive environment? We study a duopoly selling differentiated substitutable products with fixed capacities under demand uncertainty, where firms can either commit to a fixed price ex ante or elect to price contingently ex post, e.g., charge high prices in booming markets and low prices in slack markets. Interestingly, we show analytically that even for completely symmetric model primitives, asymmetric equilibria of strategic pricing decisions may arise, in which one firm commits statically and the other prices contingently; in this case, there also exists a unique mixed-strategy equilibrium. Such equilibrium behavior tends to emerge when capacity is ample, and products are less differentiated or demand uncertainty is lower. With asymmetric fixed capacities, if demand uncertainty is low, a unique asymmetric equilibrium emerges, in which the firm with more capacity chooses committed pricing and the firm with less capacity chooses contingent pricing. We identify two countervailing profit effects of contingent pricing under competition: gains from responsively charging a high price under high demand, and losses from intensified price competition under low demand. It is the latter, detrimental effect that may prevent both firms from choosing a contingent pricing strategy in equilibrium. We show that the insights remain valid when capacity decisions are endogenized. We caution that responsive price changes under aggressive competition between less differentiated products can result in profit-killing discounting.
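A toy 2×2 game can illustrate how asymmetric pricing-mode equilibria arise. The payoffs below are hypothetical numbers chosen so that mutual contingent pricing is undermined by discounting, as the abstract describes; they are not derived from the model:

```python
# Each firm chooses Committed (C) or contingent/Responsive (R) pricing.
payoff = {  # (firm1, firm2) -> (profit1, profit2); hypothetical values
    ("C", "C"): (10, 10),
    ("C", "R"): (9, 11),
    ("R", "C"): (11, 9),
    ("R", "R"): (8, 8),  # mutual contingent pricing intensifies discounting
}
strategies = ["C", "R"]

def pure_nash_equilibria(payoff):
    eqs = []
    for s1 in strategies:
        for s2 in strategies:
            p1, p2 = payoff[(s1, s2)]
            best1 = all(p1 >= payoff[(t, s2)][0] for t in strategies)
            best2 = all(p2 >= payoff[(s1, t)][1] for t in strategies)
            if best1 and best2:
                eqs.append((s1, s2))
    return eqs

print(pure_nash_equilibria(payoff))  # [('C', 'R'), ('R', 'C')]
```

With these payoffs the only pure-strategy equilibria are the two asymmetric profiles, mirroring the committed/contingent split described above.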

5.
A combinatorial optimization problem, called the Bandpass Problem, is introduced. Given a rectangular matrix A of binary elements {0,1} and a positive integer B called the Bandpass Number, a set of B consecutive non-zero elements in any column is called a bandpass. No two bandpasses in the same column may have rows in common. The Bandpass Problem consists of finding a permutation of the rows of the matrix that maximizes the total number of bandpasses, with the same given bandpass number, across all columns. This combinatorial problem arises in the optimal packing of information flows on different wavelengths into groups, to obtain the highest available cost reduction in designing and operating optical communication networks that use wavelength division multiplexing technology. Integer programming models of two versions of the bandpass problem are developed. For a matrix A with three or more columns the Bandpass Problem is proved to be NP-hard. For matrices with one or two columns, a polynomial algorithm solving the problem to optimality is presented. For the general case, fast polynomial-time heuristics are presented that provide near-optimal solutions acceptable for applications. The high quality of the generated heuristic solutions has been confirmed in extensive computational experiments. As an NP-hard combinatorial optimization problem with important applications, the Bandpass Problem offers a challenge for researchers to develop efficient computational solution methods. To encourage further research, a Library of Bandpass Problems has been developed. The Library is open to the public and consists of 90 problems of different sizes (numbers of rows and columns, density of non-zero elements of matrix A, and bandpass number B), half of them with known optimal solutions and the other half without.
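As a concrete rendering of the objective (a sketch, not code from the Library), the following counts bandpasses column by column for a given row order; within a column, cutting each maximal run of ones of length L into ⌊L/B⌋ blocks is optimal, so a greedy scan suffices:

```python
# Count the bandpasses of a 0/1 matrix (list of rows) for bandpass number B.
def count_bandpasses(rows, B):
    total = 0
    for c in range(len(rows[0])):
        run = 0
        for row in rows:
            if row[c] == 1:
                run += 1
                if run == B:      # one full bandpass packed; start a new run
                    total += 1
                    run = 0
            else:
                run = 0
    return total

def evaluate_permutation(rows, perm, B):
    """Objective value of the row permutation `perm`."""
    return count_bandpasses([rows[i] for i in perm], B)

A = [[1, 1], [1, 0], [0, 1], [1, 1]]
print(evaluate_permutation(A, [0, 1, 3, 2], 2))  # -> 2
```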

6.
Batch-Processing Scheduling with Setup Times
The problem is to minimize the total weighted completion time on a single batch-processing machine with setup times. The machine can process a batch of at most B jobs at one time; the processing time of a batch is given by the longest processing time among the jobs in the batch, and the setup time of a batch is given by the largest setup time among the jobs in the batch. This batch-processing problem reduces to the ordinary uni-processor scheduling problem when B = 1. In this paper we focus on the extreme case of B = +∞, i.e., a batch can contain any number of jobs. We present a polynomial-time approximation algorithm for the problem with a performance guarantee of 2. We further show that a special case of the problem can be solved in polynomial time.
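For B = +∞ the objective of a given batch sequence is easy to state in code. The sketch below assumes jobs are (processing time, setup time, weight) triples and that batches run sequentially, with every job in a batch completing when the batch does; it evaluates candidate schedules rather than implementing the paper's approximation algorithm:

```python
def total_weighted_completion(batches):
    """batches: ordered list of batches; each batch is a list of
    (processing_time, setup_time, weight) job triples."""
    t = 0.0
    objective = 0.0
    for batch in batches:
        t += max(s for _, s, _ in batch)   # batch setup = largest job setup
        t += max(p for p, _, _ in batch)   # batch length = longest job
        objective += t * sum(w for _, _, w in batch)
    return objective

jobs = [(3, 1, 2), (5, 2, 1), (2, 1, 4)]
print(total_weighted_completion([jobs]))                 # one big batch: 49.0
print(total_weighted_completion([[jobs[2]], jobs[:2]]))  # small job first: 42.0
```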

7.
We analyze the identification and estimation of parameters β satisfying the incomplete linear moment restrictions E(z(y − xβ)) = E(zu(z)), where z is a set of instruments and u(z) is an unknown bounded scalar function. We first provide empirically relevant examples of such a setup. Second, we show that these conditions set-identify β, and that the identified set B is bounded and convex. We provide a sharp characterization of the identified set not only when the number of moment conditions equals the number of parameters of interest, but also when the number of conditions is strictly larger than the number of parameters. We derive a necessary and sufficient condition for the validity of supernumerary restrictions, which generalizes the familiar Sargan condition. Third, we provide new results on the asymptotics of analog estimates constructed from the identification results. When B is a strictly convex set, we also construct a test of the null hypothesis β0 ∈ B whose size is asymptotically correct and which relies on the minimization of the support function of the set B − {β0}. Results of some Monte Carlo experiments are presented.
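A schematic LaTeX rendering may help fix notation; the regressor term x and the exact form of the restriction are reconstructed from the abstract and should be checked against the paper:

```latex
% Incomplete linear moment restrictions (schematic reconstruction):
\[
  \mathbb{E}\bigl[ z\,(y - x\beta) \bigr] = \mathbb{E}\bigl[ z\,u(z) \bigr],
  \qquad u(z)\ \text{an unknown bounded scalar function.}
\]
% The identified set collects every parameter value consistent with some
% admissible u:
\[
  B = \bigl\{ \beta : \exists\, u \text{ bounded with }
      \mathbb{E}[z(y - x\beta)] = \mathbb{E}[z\,u(z)] \bigr\}.
\]
% Since B is bounded and convex, it is described by its support function,
% which also underlies the test of H_0 : \beta_0 \in B:
\[
  \delta^{*}(q \mid B) = \sup_{\beta \in B} q^{\top}\beta .
\]
```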

8.
The paper presents an extension of decision theory to the analysis of social power. The power of a person, A, over another person, B, is viewed in terms of the effect A has on B's decisions. The analysis is based on the idea that B's decision regarding the performance of alternative behaviors is a function of (1) B's utility for the consequences of the behaviors and (2) B's subjective probabilities that the behaviors will lead to these consequences. In these terms, A's power over B lies in A's ability to mediate various consequences for B, contingent upon B's compliance or noncompliance. Subjects were asked to consider eight situations in which hypothetical individuals had to choose between two courses of action. In each situation another person (A) was attempting to induce the hypothetical individual (B) to choose one of the alternatives, while various situational factors were influencing B to choose the other. The subjects were asked to consider B's utilities and subjective probabilities in each situation, to indicate whether or not B should comply with A, and to rate A's power. The decision-theory analysis did well in predicting whether or not subjects would indicate that B should comply with A. Subjects also generally were able to specify correctly whether A or the situational factors had more influence over B's decision. Finally, the subjects' ratings of A's power in the eight situations were highly related to the decision-theoretic measure of power.
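The decision rule underlying the analysis is plain subjective expected utility. A minimal sketch, with hypothetical utilities and probabilities standing in for the study's stimuli:

```python
def expected_utility(outcomes):
    """outcomes: list of (subjective_probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Hypothetical inputs: A rewards compliance and punishes noncompliance.
comply     = [(0.9, 5), (0.1, -2)]
not_comply = [(0.7, -4), (0.3, 8)]

eu_c, eu_n = expected_utility(comply), expected_utility(not_comply)
print("B should comply" if eu_c > eu_n else "B should not comply",
      f"(EU comply = {eu_c:.2f}, EU not comply = {eu_n:.2f})")
```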

9.
It is widely believed by the American public that quality education is an unattainable goal in American elementary and secondary schools. A recent survey of twelfth-grade students in the thirteen leading developed countries showed that American students ranked thirteenth. Another recent survey concluded that 25 million of our citizens are functionally illiterate and that an additional 25 million have to update their skills and/or knowledge to remain competitive in today's marketplace. The federal government and some state governments have finally recognized the relatively poor condition of our present educational system and have set initiatives to correct the educational crisis that exists. The authors describe former President Bush's "America 2000: An Educational Strategy" and President Clinton's "GOALS 2000: Educate America", both of which have desirable goals but lack an approach for reaching the target and fail to specify any type of accountability for non-achievement. The authors then compare the American education system to an industrial or service organization and attempt to define the "customer". Once the "customer" is identified, Deming's principles for management are shown to apply to education as well as to manufacturing and other service organizations. In developing this paper, the authors focus on:
  1. government initiatives to improve the educational environment,
  2. the views of leading experts on the applicability of total quality management concepts to education, and
  3. the strides that have been made toward educational quality improvement by some schools.
A case study describing the benefits which have resulted from implementation of a total quality system at an inner-city suburban New York City K-12 school is then presented.

10.
Risk assessment is the process of estimating the likelihood that an adverse effect may result from exposure to a specific health hazard. The process traditionally involves hazard identification, dose-response assessment, exposure assessment, and risk characterization to answer “How many excess cases of disease A will occur in a population of size B due to exposure to agent C at dose level D?” For natural hazards, however, we modify the risk assessment paradigm to answer “How many excess cases of outcome Y will occur in a population of size B due to natural hazard event E of severity D?” Using a modified version involving hazard identification, risk factor characterization, exposure characterization, and risk characterization, we demonstrate that epidemiologic modeling and measures of risk can quantify the risks from natural hazard events. We further extend the paradigm to address mitigation, the equivalent of risk management, to answer “What is the risk for outcome Y in the presence of prevention intervention X relative to the risk for Y in the absence of X?” We use the preventable fraction to estimate the efficacy of mitigation, or reduction in adverse health outcomes as a result of a prevention strategy under ideal circumstances, and further estimate the effectiveness of mitigation, or reduction in adverse health outcomes under typical community-based settings. By relating socioeconomic costs of mitigation to measures of risk, we illustrate that prevention effectiveness is useful for developing cost-effective risk management options.
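The preventable fraction mentioned above has a one-line formula, PF = (R0 − R1)/R0, where R0 is risk without the intervention and R1 risk with it. A short sketch with illustrative numbers (not data from the paper):

```python
def preventable_fraction(risk_without: float, risk_with: float) -> float:
    """PF = (R0 - R1) / R0: proportion of adverse outcomes averted."""
    return (risk_without - risk_with) / risk_without

# Efficacy uses risk under ideal circumstances; effectiveness uses risk
# under typical community-based settings. Numbers are illustrative only.
efficacy      = preventable_fraction(risk_without=0.20, risk_with=0.05)
effectiveness = preventable_fraction(risk_without=0.20, risk_with=0.12)
print(f"efficacy = {efficacy:.0%}, effectiveness = {effectiveness:.0%}")
# efficacy = 75%, effectiveness = 40%
```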

11.
This paper analyzes the complexity of the contraction fixed point problem: compute an ε-approximation to the fixed point V* = Γ(V*) of a contraction mapping Γ that maps a Banach space B_d of continuous functions of d variables into itself. We focus on quasi-linear contractions, where Γ is a nonlinear functional of a finite number of conditional expectation operators. This class includes contractive Fredholm integral equations that arise in asset pricing applications and the contractive Bellman equation from dynamic programming. In the absence of further restrictions on the domain of Γ, the quasi-linear fixed point problem is subject to the curse of dimensionality, i.e., in the worst case the minimal number of function evaluations and arithmetic operations required to compute an ε-approximation to a fixed point V* ∈ B_d increases exponentially in d. We show that the curse of dimensionality disappears if the domain of Γ has additional special structure. We identify a particular type of special structure for which the problem is strongly tractable even in the worst case, i.e., the number of function evaluations and arithmetic operations needed to compute an ε-approximation of V* is bounded by Cε^(−p), where C and p are constants independent of d. We present examples of economic problems that have this type of special structure, including a class of rational expectations asset pricing problems for which the optimal exponent p = 1 is nearly achieved.
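The fixed point itself can be computed by successive approximation, which the Banach fixed-point theorem guarantees converges geometrically at the contraction modulus. A one-dimensional toy sketch (the paper's setting is a function space B_d; the operator below is an invented scalar stand-in):

```python
def fixed_point(gamma, v0, eps=1e-8, max_iter=10_000):
    """Iterate v <- gamma(v) until successive iterates differ by < eps."""
    v = v0
    for _ in range(max_iter):
        v_next = gamma(v)
        # For a contraction with modulus beta,
        # ||v_k - V*|| <= ||v_{k+1} - v_k|| * beta / (1 - beta).
        if abs(v_next - v) < eps:
            return v_next
        v = v_next
    raise RuntimeError("did not converge")

beta = 0.9
gamma = lambda v: 1.0 + beta * v     # toy contraction, modulus 0.9
print(fixed_point(gamma, v0=0.0))    # converges to 1 / (1 - beta) = 10
```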

12.
This paper considers the problem of choosing the number of bootstrap repetitions B for bootstrap standard errors, confidence intervals, confidence regions, hypothesis tests, p-values, and bias correction. For each of these problems, the paper provides a three-step method for choosing B to achieve a desired level of accuracy. Accuracy is measured by the percentage deviation of the bootstrap standard error estimate, confidence interval length, test's critical value, test's p-value, or bias-corrected estimate based on B bootstrap simulations from the corresponding ideal bootstrap quantities for which B = ∞. The results apply quite generally to parametric, semiparametric, and nonparametric models with independent and dependent data. The results apply to the standard nonparametric iid bootstrap, moving block bootstraps for time series data, parametric and semiparametric bootstraps, and bootstraps for regression models based on bootstrapping residuals. Monte Carlo simulations show that the proposed methods work very well.
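As a reminder of what B controls, here is a minimal bootstrap standard-error sketch; the paper's three-step rule for choosing B is not reproduced, and the data are invented. The finite-B estimate approximates the ideal B = ∞ quantity, and its simulation noise shrinks as B grows:

```python
import random
import statistics

def bootstrap_se(data, stat, B, rng=random.Random(0)):
    """Standard deviation of `stat` over B bootstrap resamples."""
    n = len(data)
    reps = [stat([rng.choice(data) for _ in range(n)]) for _ in range(B)]
    return statistics.stdev(reps)

data = [2.1, 3.4, 1.9, 4.2, 2.8, 3.9, 2.5, 3.1]
for B in (50, 200, 1000):   # larger B: closer to the ideal bootstrap SE
    print(B, round(bootstrap_se(data, statistics.mean, B), 4))
```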

13.
In Becker's (1973) neoclassical marriage market model, matching is positively assortative if types are complements: i.e., match output f(x, y) is supermodular in x and y. We reprise this famous result assuming time-intensive partner search and transferable output. We prove existence of a search equilibrium with a continuum of types, and then characterize matching. After showing that Becker's conditions on match output no longer suffice for assortative matching, we find sufficient conditions valid for any search frictions and type distribution: supermodularity not only of output f, but also of log f_x and log f_xy. Symmetric submodularity conditions imply negatively assortative matching. Examples show these conditions are necessary.
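Stated schematically in LaTeX (a transcription of the abstract's conditions with subscripts denoting partial derivatives, not the paper's exact theorem):

```latex
% Sufficient conditions for positively assortative matching under search:
\[
  \underbrace{f_{xy} \ge 0}_{\text{$f$ supermodular}}, \qquad
  \underbrace{\tfrac{\partial^2}{\partial x\,\partial y}\log f_x \ge 0}_{\text{$\log f_x$ supermodular}}, \qquad
  \underbrace{\tfrac{\partial^2}{\partial x\,\partial y}\log f_{xy} \ge 0}_{\text{$\log f_{xy}$ supermodular}}.
\]
% The symmetric submodularity conditions (reversed inequalities) imply
% negatively assortative matching.
```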

14.
We study a combinatorial problem motivated by a receiver-oriented model of TCP traffic from Istrate et al. (2006), which incorporates information on both arrival times and the dynamics of packet IDs. An important component of this model is a many-to-one mapping F_B from sequences of IDs into a sequence of buffer sizes. We show that: (i) given a buffer sequence B, constructing a sequence A of IDs that belongs to the preimage of B is no harder than finding matchings in a bipartite graph; (ii) counting the number of sequences A of packet IDs that belong to the preimage of B can be done in linear time in the special case when there exists a constant upper bound on the maximum entry in B; (iii) this problem also has a fully polynomial randomized approximation scheme when we have a constant upper bound on the number of repeats in the packet sequences in the preimage. We also provide experimental evidence that the two previous results suffice to efficiently count the number of preimages for buffer sequences observed in real TCP data.
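For intuition about a map from ID sequences to buffer sizes, here is one plausible in-order-delivery reading, sketched in Python: a packet is buffered until every smaller ID has arrived, and buffer occupancy is recorded after each arrival. This is an illustrative reconstruction, not necessarily the exact F_B of Istrate et al. (2006):

```python
def buffer_sequence(ids):
    """Map a permutation of packet IDs 1..n to post-arrival buffer sizes."""
    buffered = set()
    next_expected = 1                      # smallest ID not yet delivered
    sizes = []
    for pid in ids:
        buffered.add(pid)
        while next_expected in buffered:   # flush the deliverable prefix
            buffered.remove(next_expected)
            next_expected += 1
        sizes.append(len(buffered))
    return sizes

print(buffer_sequence([2, 1, 4, 5, 3]))  # -> [1, 0, 1, 2, 0]
```

Distinct ID sequences can produce the same size sequence, which is what makes the preimage-counting problems above nontrivial.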

15.
We investigate the implications of collective and individual producer responsibility (CPR and IPR, respectively) models of product take-back laws for e-waste on manufacturers' design for product recovery (DfR) choices and profits, and on consumer surplus in the presence of product competition. We show that IPR offers superior DfR incentives compared to CPR and provides a level competitive playing field. CPR may distort competition and allow free-riding on DfR efforts to reduce product recovery costs. Thus, manufacturer preferences for IPR or CPR may differ because of the free-riding implications under CPR, with even high-end manufacturers having incentives to free-ride under certain competitive conditions. The policy choice between IPR and CPR is not clear-cut from an economic welfare perspective. This choice involves a comparison between the effects of superior recovery cost reduction through improved DfR under IPR and the operational cost-efficiency under CPR.

16.
This paper considers regression models for cross-section data that exhibit cross-section dependence due to common shocks, such as macroeconomic shocks. The paper analyzes the properties of least squares (LS) estimators in this context. The results allow for any form of cross-section dependence and heterogeneity across population units. The probability limits of the LS estimators are determined, and necessary and sufficient conditions are given for consistency. The asymptotic distributions of the estimators are found to be mixed normal after recentering and scaling. The t, Wald, and F statistics are found to have asymptotic standard normal, χ², and scaled χ² distributions, respectively, under the null hypothesis when the conditions required for consistency of the parameter under test hold. However, the absolute values of the t, Wald, and F statistics are found to diverge to infinity under the null hypothesis when these conditions fail. Confidence intervals exhibit similarly dichotomous behavior. Hence, common shocks are found to be innocuous in some circumstances, but quite problematic in others. Models with factor structures for errors and regressors are considered. Using the general results, conditions are determined under which consistency of the LS estimators holds and fails in models with factor structures. The results are extended to cover heterogeneous and functional factor structures in which common factors have different impacts on different population units.
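A small simulation can show the problematic case, under one illustrative design (not the paper's general setup): every unit's error loads on a single macro shock f with a loading equal to its own regressor, so the shock does not average out across units. The LS slope then converges to beta + f rather than beta, i.e., its limit is random and consistency fails no matter how large n is:

```python
import random
import statistics

def ls_slope(n, beta, rng):
    f = rng.gauss(0, 1)                           # one common shock per sample
    x = [rng.gauss(0, 1) for _ in range(n)]
    # factor loading correlated with the regressor (loading = x_i):
    y = [beta * xi + xi * f + rng.gauss(0, 1) for xi in x]
    xbar, ybar = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    return sxy / sxx

rng = random.Random(0)
for n in (100, 10_000):
    slopes = [ls_slope(n, beta=1.0, rng=rng) for _ in range(200)]
    # dispersion of the LS slope across samples does not shrink with n
    print(n, round(statistics.stdev(slopes), 3))
```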

17.
We study minimum-cost sensor placement on a bounded 3D sensing field, R, which comprises a number of discrete points that may or may not be grid points. Suppose we have ℓ types of sensors available, with different sensing ranges and different costs. Given an integer σ ≥ 1, we want to find a selection of sensors and a subset of points at which to place them such that every point in R is covered by at least σ sensors and the total cost of the sensors is minimum. This problem is known to be NP-hard. Let k_i denote the maximum number of points that can be covered by a sensor of the i-th type. We present in this paper a polynomial-time approximation algorithm for this problem with a proven approximation ratio γ. In applications where the distance between any two points has a fixed positive lower bound, each k_i is a constant, and so we have a polynomial-time approximation algorithm with a constant guarantee. While γ may be large, we note that it is only a worst-case upper bound. In practice the actual approximation ratio is small, even on randomly generated points that do not have a fixed positive minimum distance between them. We provide a number of numerical results comparing approximation solutions and optimal solutions, and show that the actual approximation ratios in these examples are all less than 3, even though γ is substantially larger. This research was supported in part by NSF under grant CCF-04080261 and by NSF of China under grant 60273062.
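For flavor, here is the standard greedy heuristic for the underlying weighted multi-cover problem: repeatedly buy the candidate sensor with the lowest cost per unit of remaining coverage demand. This is an illustrative sketch, not the paper's algorithm, and it carries no claim about the paper's ratio:

```python
def greedy_placement(points, candidates, sigma):
    """points: point ids; candidates: (cost, covered_point_set) pairs, one
    per sensor type per allowed location; sigma: required coverage depth."""
    need = {p: sigma for p in points}          # remaining coverage demand
    chosen, total_cost = [], 0.0
    while any(need[p] > 0 for p in points):
        def gain(c):                           # points whose demand c still reduces
            return sum(1 for p in c[1] if need[p] > 0)
        best = min((c for c in candidates if gain(c) > 0),
                   key=lambda c: c[0] / gain(c), default=None)
        if best is None:
            raise ValueError("demand cannot be met by the candidate set")
        chosen.append(best)
        total_cost += best[0]
        for p in best[1]:
            if need[p] > 0:
                need[p] -= 1
    return chosen, total_cost

points = ["p1", "p2", "p3"]
candidates = [(3.0, {"p1", "p2"}), (2.0, {"p2", "p3"}), (4.0, {"p1", "p2", "p3"})]
print(greedy_placement(points, candidates, sigma=1))  # total cost 5.0
```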

18.
We characterize optimal mechanisms for the multiple-good monopoly problem and provide a framework to find them. We show that a mechanism is optimal if and only if a measure μ derived from the buyer's type distribution satisfies certain stochastic dominance conditions. This measure expresses the marginal change in the seller's revenue under marginal changes in the rent paid to subsets of buyer types. As a corollary, we characterize the optimality of grand-bundling mechanisms, strengthening several results in the literature, where only sufficient optimality conditions have been derived. As an application, we show that the optimal mechanism for n independent uniform items, each supported on [c, c+1], is a grand-bundling mechanism as long as c is sufficiently large, extending Pavlov's (2011) result for two items. At the same time, our characterization also implies that, for all c and for all sufficiently large n, the optimal mechanism for n independent uniform items supported on [c, c+1] is not a grand-bundling mechanism.

19.
On lazy bureaucrat scheduling with common deadlines
Lazy bureaucrat scheduling is a new class of scheduling problems introduced by Arkin et al. (Inf. Comput. 184:129–146, 2003). In this paper we focus on the case where all jobs share a common deadline, a problem denoted CD-LBSP, which has been considered by Esfahbod et al. (Algorithms and Data Structures. Lecture Notes in Computer Science, vol. 2748, pp. 59–66, 2003). We first show that the worst-case ratio of the algorithm SJF (Shortest Job First) is two under the objective function [min-time-spent], thus answering an open question posed by Esfahbod et al. We further present two approximation schemes A_k and B_k, each having a worst-case ratio of (k+1)/k for any given integer k > 0, under the objective functions [min-makespan] and [min-time-spent], respectively. Finally, we prove that the problem CD-LBSP remains NP-hard under several objective functions, even if all jobs share the same release time. A preliminary version of this paper appeared in Proceedings of the 7th Latin American Symposium on Theoretical Informatics, pp. 515–523, 2006. Research of G. Zhang supported in part by NSFC (60573020).
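A minimal sketch of SJF for CD-LBSP under [min-time-spent], assuming all jobs are released at time 0 and that the bureaucrat must keep working while some remaining job can still finish by the common deadline D:

```python
def sjf_time_spent(jobs, D):
    """jobs: processing times; returns total time the bureaucrat works."""
    t = 0
    for p in sorted(jobs):      # shortest job first
        if t + p <= D:          # job still completable by the deadline
            t += p              # execute it
        else:
            break               # jobs are sorted, so nothing else fits either
    return t

print(sjf_time_spent([2, 3, 6], D=7))  # works 5 units (jobs 2 and 3), skips 6
```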

20.
A pilot study of an interactive hazards education program was carried out in Canberra (Australia), with direct input from youth participants. Effects were evaluated in relation to youths' interest in disasters, motivation to prepare, risk awareness, knowledge indicators, perceived preparedness levels, planning and practice for emergencies, and fear and anxiety indicators. Parents also provided ratings, including of actual home-based preparedness activities. Using a single-group pretest-posttest design with benchmarking, a sample of 20 youths and their parents from a low-SES community participated. Findings indicated beneficial changes on a number of indicators. Preparedness indicators increased significantly from pre- to posttest on both youth (p < 0.01) and parent ratings (p < 0.01); parent ratings reflected an increase of just under six home-based preparedness activities. Youth knowledge about disaster mitigation also increased significantly (p < 0.001), rising 39% from pretest levels. While personalized risk perceptions significantly increased (p < 0.01), anxiety and worry levels either did not change (generalized anxiety, p > 0.05) or fell between pre- and posttest (hazards-specific fears, worry, and distress; ps ranging from < 0.05 to < 0.001). In terms of predictors of preparedness, a number of variables were found to predict posttest preparedness levels, including information searching done by participants between education sessions. These pilot findings are the first to reflect quasi-experimental outcomes for a youth hazards education program carried out in a setting other than a school and focused on a sample of youth from a low-SES community.
