Similar Documents
20 similar documents found (search time: 109 ms)
1.
We provide a possible explanation for the empirical puzzle that mergers often reduce profits, but raise share prices. If being an “insider” is better than being an “outsider”, firms may merge to preempt their partner merging with a rival. The insiders' stock market value is increased, since the risk of becoming an outsider is eliminated. These results are derived in an endogenous‐merger model, predicting the conditions under which mergers occur, when they occur, and how the surplus is shared. (JEL: L13, L41, G34, C78)

2.
This paper is about selection of neighbors in models of social interactions. I study a general equilibrium model of behavior subject to endogenous social influences when heterogeneous individuals can choose whom to associate with, buying associations on a “memberships market”. Social effects in behavior turn out to be a stratifying force: The desire for valuable interactions induces inefficient sorting and may lead to the endogenous emergence of “social traps”. The theory is then used to suggest identification strategies that may solve, in a micro‐founded way, identification and selection problems that typically affect empirical work on social interactions. Such strategies offer a viable alternative when valid instrumental variables or randomized experiments are not available. (JEL: C26, D85, Z13, Z19)

3.
This paper presents the results of a natural experiment conducted at a U.S. high‐tech manufacturer. The experiment had as its treatment the adoption, at a single point in time, of a comprehensive enterprise information system throughout the functional groups charged with customer order fulfillment. This information technology (IT) adoption was not accompanied by substantial contemporaneous business process changes. Immediately after adoption, lead time and on‐time delivery performance suffered, causing a “performance dip” similar to those observed after the introduction of capital equipment onto shop floors. Lead times and on‐time delivery percentages then improved along a learning curve. After several months, performance in these areas improved significantly relative to preadoption levels. These observed performance patterns could not be well explained by rival causal factors such as order, production, and inventory volumes; head count; and new product introductions. Thus, this longitudinal research presents initial evidence of a causal link between IT adoption and subsequent improvement in operational performance measures, as well as evidence of the timescale over which these benefits appear.
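To make the learning-curve claim concrete, here is a minimal sketch (not the authors' analysis) that fits a log-linear learning curve to monthly post-adoption lead times; the month and lead-time values are hypothetical placeholders, not the study's data.

```python
# Minimal sketch (not the authors' code): fit a learning curve y_t = a * t^(-b)
# to monthly mean lead times observed after the IT adoption.
import numpy as np

months = np.arange(1, 13)                      # months since adoption (hypothetical)
lead_time = np.array([34, 31, 29, 26, 25, 23,  # observed mean lead time in days (made up)
                      22, 21, 20, 20, 19, 19], dtype=float)

# OLS on the log-log form: log y = log a - b * log t
X = np.column_stack([np.ones_like(months, dtype=float), np.log(months)])
coef, *_ = np.linalg.lstsq(X, np.log(lead_time), rcond=None)
a, b = np.exp(coef[0]), -coef[1]

print(f"learning-curve fit: lead_time ~ {a:.1f} * t^(-{b:.2f})")
print("predicted month-18 lead time:", round(a * 18 ** (-b), 1), "days")
```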

4.
Nonseparable panel models are important in a variety of economic settings, including discrete choice. This paper gives identification and estimation results for nonseparable models under time‐homogeneity conditions that are like “time is randomly assigned” or “time is an instrument.” Partial‐identification results for average and quantile effects are given for discrete regressors, under static or dynamic conditions, in fully nonparametric and in semiparametric models, with time effects. It is shown that the usual, linear, fixed‐effects estimator is not a consistent estimator of the identified average effect, and a consistent estimator is given. A simple estimator of identified quantile treatment effects is given, providing a solution to the important problem of estimating quantile treatment effects from panel data. Bounds for overall effects in static and dynamic models are given. The dynamic bounds provide a partial‐identification solution to the important problem of estimating the effect of state dependence in the presence of unobserved heterogeneity. The impact of T, the number of time periods, is shown by deriving shrinkage rates for the identified set as T grows. We also consider semiparametric, discrete‐choice models and find that semiparametric panel bounds can be much tighter than nonparametric bounds. Computationally convenient methods for semiparametric models are presented. We propose a novel inference method that applies in panel data and other settings and show that it produces uniformly valid confidence regions in large samples. We give empirical illustrations.
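As a rough illustration of the bounding idea for average effects (a simplification, not the paper's estimator), the sketch below assumes a two-period panel, a binary regressor, an outcome bounded in [0, 1], and no time effects: the effect is identified from within-person changes for "switchers" and bounded at the worst case for "stayers".

```python
# Illustrative sketch (not the paper's estimator): worst-case bounds on an
# average effect with binary regressor X and outcome Y in [0, 1], T = 2,
# and time treated as if randomly assigned (no time effects).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000
x1 = rng.integers(0, 2, n)
x2 = rng.integers(0, 2, n)
alpha = rng.normal(0, 0.2, n)                        # unobserved heterogeneity

def outcome(x):
    return np.clip(0.4 + 0.2 * x + alpha + rng.normal(0, 0.05, n), 0, 1)

y1, y2 = outcome(x1), outcome(x2)

switch_up = (x1 == 0) & (x2 == 1)
switch_dn = (x1 == 1) & (x2 == 0)
stayer = ~(switch_up | switch_dn)

effects = np.zeros(n)
effects[switch_up] = y2[switch_up] - y1[switch_up]   # identified for switchers
effects[switch_dn] = y1[switch_dn] - y2[switch_dn]

lower = effects.copy(); lower[stayer] = -1.0         # worst-case bounds for stayers
upper = effects.copy(); upper[stayer] = +1.0

print(f"share of stayers: {stayer.mean():.2f}")
print(f"bounds on the average effect: [{lower.mean():.3f}, {upper.mean():.3f}]")
```

More time periods shrink the stayer share, which is one intuition for why the identified set tightens as T grows.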

5.
Humans are continuously exposed to chemicals that are suspected or proven endocrine disrupting chemicals (EDCs). Risk management of EDCs presents a major unmet challenge because the available data for adverse health effects are generated by examining one compound at a time, whereas real‐life exposures are to mixtures of chemicals. In this work, we integrate epidemiological and experimental evidence toward a whole mixture strategy for risk assessment. To illustrate, we conduct the following four steps in a case study: (1) identification of single EDCs (“bad actors”)—measured in prenatal blood/urine in the SELMA study—that are associated with a shorter anogenital distance (AGD) in baby boys; (2) definition and construction of a “typical” mixture consisting of the “bad actors” identified in Step 1; (3) experimental testing of this mixture in an in vivo animal model to estimate a dose–response relationship and determine a point of departure (i.e., reference dose [RfD]) associated with an adverse health outcome; and (4) use of a statistical measure of “sufficient similarity” to compare the experimental RfD (from Step 3) to the exposure measured in the human population and generate a “similar mixture risk indicator” (SMRI). The objective of this exercise is to generate a proof of concept for the systematic integration of epidemiological and experimental evidence with mixture risk assessment strategies. Using a whole mixture approach, we find a higher proportion of pregnant women at risk (13%) than under more traditional additivity models (3%) or a compound‐by‐compound strategy (1.6%).
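A hedged numerical sketch of why the three screening strategies can disagree is given below; the exposure distribution, the per-compound RfDs, and the mixture RfD are invented placeholders, so the resulting percentages will not reproduce the study's 13%, 3%, and 1.6%.

```python
# Hedged illustration (simulated data, not SELMA measurements): compare how many
# individuals exceed a risk threshold under (a) a whole-mixture indicator relative
# to an experimentally derived mixture RfD, (b) a dose-addition hazard index, and
# (c) a compound-by-compound screen. All dose and RfD values are arbitrary.
import numpy as np

rng = np.random.default_rng(1)
n_women, n_chem = 10_000, 5
exposure = rng.lognormal(mean=-1.0, sigma=1.0, size=(n_women, n_chem))  # ug/kg/day

rfd_single = np.array([6.0, 5.0, 8.0, 7.0, 4.0])   # per-compound RfDs (placeholder)
mixture_rfd = 4.0                                   # RfD for the whole "typical" mixture (placeholder)

whole_mixture_at_risk = exposure.sum(axis=1) > mixture_rfd          # (a) whole-mixture screen
hazard_index_at_risk = (exposure / rfd_single).sum(axis=1) > 1.0    # (b) dose-addition hazard index
single_at_risk = (exposure > rfd_single).any(axis=1)                # (c) compound-by-compound screen

for name, flag in [("whole mixture", whole_mixture_at_risk),
                   ("hazard index", hazard_index_at_risk),
                   ("compound-by-compound", single_at_risk)]:
    print(f"{name:>22}: {100 * flag.mean():.1f}% of the simulated population flagged")
```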

6.
Empirical evidence suggests that perfectionism can affect choice behavior. When striving for perfection, a person can desire to keep normatively appealing options feasible even if she persistently fails to use these options later. For instance, she can “pay not to go to the gym,” as in DellaVigna and Malmendier (2006). By contrast, some perfectionists may avoid normatively important tasks for fear of negative self‐evaluation of their performance. This paper models perfectionist behaviors in Gul and Pesendorfer's (2001) menu framework where agents may be tempted to deviate from their long‐term normative objectives. In addition to self‐control costs, I identify a utility component that reflects emotional costs and benefits of perfectionism. My model is derived from axioms imposed on preferences over menus in an essentially unique way.

7.
We analyze the effects of the unprecedented rise in trade between Germany and “the East” (China and Eastern Europe) in the period 1988–2008 on German local labor markets. Using detailed administrative data, we exploit the cross‐regional variation in initial industry structures and use trade flows of other high‐income countries as instruments for regional import and export exposure. We find that the rise of the East in the world economy caused substantial job losses in German regions specialized in import‐competing industries, both in manufacturing and beyond. Regions specialized in export‐oriented industries, however, experienced even stronger employment gains and lower unemployment. In the aggregate, we estimate that this trade integration has caused some 442,000 additional jobs in the economy and contributed to retaining the manufacturing sector in Germany. This is almost exclusively driven by the rise of Eastern Europe, not by China. We also conduct an analysis at the individual worker level, and find that trade had a stabilizing overall effect on employment relationships.
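The sketch below illustrates, with simulated data, the shift-share construction of regional import exposure and a just-identified IV fit using other countries' trade flows as the instrument; all variable names and magnitudes are hypothetical, and the code is not the authors' estimation pipeline.

```python
# Sketch of a shift-share exposure measure and a just-identified IV fit
# (numpy only; data and effect sizes are hypothetical, not the paper's dataset).
import numpy as np

rng = np.random.default_rng(2)
R, J = 200, 30                                   # regions, industries

emp_share = rng.dirichlet(np.ones(J), size=R)    # region r's employment share in industry j
d_import_de = rng.normal(1.0, 0.5, J)            # German import growth from "the East" by industry
d_import_other = d_import_de + rng.normal(0, 0.2, J)  # other high-income countries (instrument shifts)

exposure = emp_share @ d_import_de               # regional import exposure (endogenous regressor)
instrument = emp_share @ d_import_other          # same shares, other countries' trade flows

d_emp = -0.5 * exposure + rng.normal(0, 0.3, R)  # simulated regional employment change

# Just-identified IV: beta = (Z'X)^(-1) Z'y, with a constant included
X = np.column_stack([np.ones(R), exposure])
Z = np.column_stack([np.ones(R), instrument])
beta_iv = np.linalg.solve(Z.T @ X, Z.T @ d_emp)
print("IV estimate of the import-exposure effect:", round(beta_iv[1], 3))
```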

8.
9.
We show that efficient bargaining is impossible for a wide class of economic settings and property rights. These settings are characterized by (i) the existence of “adverse efficient opt‐out types”, whose participation does not change the efficient allocation and who, when they opt out, are the worst type other agents can face, and (ii) non‐existence of the “marginal core”, or its multivaluedness with a positive probability. We also examine the optimal allocation of property rights within a given class that satisfies (i), such as simple property rights, liability rules, and dual‐chooser rules. We characterize property rights that minimize the expected subsidy required to implement efficiency. With two agents, simple property rights that are optimal in this way maximize the expected surplus at the status quo allocation, but this no longer holds with more agents. We also study “second‐best” budget‐balanced bargaining under a liability rule. The optimal “second‐best” liability rule may differ from, but is often close to, the expectation of the victim's harm, which would be optimal if there were no bargaining. However, liability rules that are close to a simple property right result in a lower expected surplus than the simple property right they are near.

10.
The standard value of information approach of decision analysis assumes that the individual or agency that collects the information is also in control of the subsequent decisions based on the information. We refer to this situation as the “value of information with control (VOI‐C).” This paradigm leads to powerful results, for example, that the value of information cannot be negative and that it is zero when the information cannot change subsequent decisions. In many real world situations, however, the agency collecting the information is different from the one that makes the decision on the basis of that information. For example, an environmental research group may contemplate funding a study that can affect an environmental policy decision that is made by a regulatory organization. In this two‐agency formulation, the information‐acquiring agency has to decide whether an investment in research is worthwhile, while not being in control of the subsequent decision. We refer to this situation as “value of information without control (VOI‐NC).” In this article, we present a framework for the VOI‐NC and illustrate it with an example of a specific problem of determining the value of a research program on the health effects of power‐frequency electromagnetic fields. We first compare the VOI‐C approach with the VOI‐NC approach. We show that the VOI‐NC can be negative, but that with high‐quality research (low probabilities of errors of type I and II) it is positive. We also demonstrate, both in the example and in more general mathematical terms, that the VOI‐NC for environmental studies breaks down into a sum of the VOI‐NC due to the possible reduction of environmental impacts and the VOI‐NC due to the reduction of policy costs, with each component being positive for low environmental impacts and high‐quality research. Interesting results include that the environmental and cost components of the VOI‐NC move in opposite directions as a function of the probability of environmental impacts and that VOI‐NC can be positive, even though the probability of environmental impacts is zero or one.
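A minimal two-agency sketch of the VOI-NC logic follows, assuming the regulator regulates only after a positive study and does nothing otherwise; the damage, policy-cost, and error-rate numbers are illustrative, not those of the electromagnetic-field case study.

```python
# Minimal sketch of "value of information without control" (VOI-NC): a research
# group pays for a study; a separate regulator then regulates iff the study is
# positive. All numbers are illustrative assumptions.
def expected_cost(p, damage, policy_cost, regulate_if_impact, regulate_if_none):
    # expected societal cost = environmental damage when a real impact goes
    # unregulated + policy cost whenever regulation is imposed
    cost_impact = damage * (1 - regulate_if_impact) + policy_cost * regulate_if_impact
    cost_none = policy_cost * regulate_if_none
    return p * cost_impact + (1 - p) * cost_none

def voi_nc(p, damage, policy_cost, alpha, beta):
    # Without the study the regulator (by assumption) does not regulate.
    baseline = expected_cost(p, damage, policy_cost, 0.0, 0.0)
    # With the study: P(positive | impact) = 1 - beta, P(positive | no impact) = alpha.
    informed = expected_cost(p, damage, policy_cost, 1.0 - beta, alpha)
    return baseline - informed

# High-quality research (small alpha, beta) yields a positive value ...
print(voi_nc(p=0.3, damage=100.0, policy_cost=20.0, alpha=0.05, beta=0.10))   # > 0
# ... while an error-prone study of a rare impact makes VOI-NC negative.
print(voi_nc(p=0.05, damage=100.0, policy_cost=20.0, alpha=0.40, beta=0.40))  # < 0
```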

11.
This paper studies issues associated with designing process control systems when the testing equipment is subject to random shifts. We consider a production process with two states: in control and out of control. The process may shift randomly to the out‐of‐control state over time. The process is monitored by periodically sampling finished items from the process. The equipment used to test sampled items also is assumed to have two states and may shift randomly during the testing process. We formulate a cost model for finding the optimal process control policy that minimizes the expected unit time cost. Numerical results show that shifts of the testing equipment may significantly affect the performance of a process control policy. We also study the effects of the testing equipment's shifts on the selection of process control policies.
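The following Monte Carlo sketch is an assumption-laden stand-in for the paper's analytical cost model: it estimates the expected cost per unit time of a periodic-sampling policy when both the process and the testing equipment can shift, with all parameters invented.

```python
# Monte Carlo sketch (not the paper's cost model): expected cost per unit time of
# a periodic-sampling control policy when both the production process and the
# testing equipment can shift out of control. Parameters are illustrative.
import numpy as np

def cost_per_unit_time(sample_every, horizon=100_000, seed=0,
                       p_proc_shift=0.01,    # per-period prob. the process shifts out of control
                       p_test_shift=0.005,   # per-period prob. the tester shifts (readings unreliable)
                       c_out=5.0,            # cost per period of running out of control undetected
                       c_sample=1.0,         # cost per sample taken
                       c_restore=50.0):      # cost to investigate and restore process and tester
    rng = np.random.default_rng(seed)
    total, proc_ok, test_ok = 0.0, True, True
    for t in range(horizon):
        if proc_ok and rng.random() < p_proc_shift:
            proc_ok = False
        if test_ok and rng.random() < p_test_shift:
            test_ok = False
        if not proc_ok:
            total += c_out
        if t % sample_every == 0:
            total += c_sample
            # a healthy tester detects the true state; a shifted tester is a coin flip
            signal = (not proc_ok) if test_ok else (rng.random() < 0.5)
            if signal:
                total += c_restore
                proc_ok, test_ok = True, True
    return total / horizon

for h in (5, 10, 20, 40):
    print(f"sampling every {h:>2} periods -> cost per unit time ~ {cost_per_unit_time(h):.3f}")
```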

12.
Stricter laws require more incisive and costlier enforcement. Because enforcement activity depends both on available tax revenue and the honesty of officials, the optimal legal standard of a benevolent government is increasing in per capita income and decreasing in officials' corruption. In contrast to the “tollbooth view” of regulation, the standard chosen by a self‐interested government is a non‐monotonic function of officials' corruption, and can be either lower or higher than that chosen by a benevolent regulator. International evidence on environmental regulation shows that standards correlate positively with per‐capita income, and negatively with corruption, consistent with the model's predictions for benevolent governments. (JEL: D73, K42, L51)

13.
We introduce incomplete contracts in a model where multinational firms from a certain country (“North”) can decide to serve a foreign market (“South”) through exports or through horizontal foreign direct investment (FDI). FDI relies on the supply of specialized intermediate inputs that could be supplied either by northern suppliers or by suppliers located in South. Intermediate sourcing contracts are complete in North but not in South. Were southern contracts also complete, FDI would arise only when trade barriers are high enough. Incomplete contracts in South generate, instead, a non‐linear relation between trade barriers and FDI as foreign investment emerges also when trade barriers are low enough. The reason is the positive effect that low trade barriers have on the bargaining power of final producers with respect to their southern suppliers. (JEL: F23, F12)

14.
Risk aversion (a second‐order risk preference) is a time‐proven concept in economic models of choice under risk. More recently, the higher order risk preferences of prudence (third‐order) and temperance (fourth‐order) also have been shown to be quite important. While a majority of the population seems to exhibit both risk aversion and these higher order risk preferences, a significant minority does not. We show how both risk‐averse and risk‐loving behaviors might be generated by a simple type of basic lottery preference for either (1) combining “good” outcomes with “bad” ones, or (2) combining “good with good” and “bad with bad,” respectively. We further show that this dichotomy is fairly robust at explaining higher order risk attitudes in the laboratory. In addition to our own experimental evidence, we take a second look at the extant laboratory experiments that measure higher order risk preferences and we find a fair amount of support for this dichotomy. Our own experiment also is the first to look beyond fourth‐order risk preferences, and we examine risk attitudes at even higher orders.
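A small numerical illustration of the dichotomy (my own construction, not the authors' experimental design): with log utility (u''' > 0) a decision maker prefers to place a zero-mean risk in the state without the sure loss, i.e., to combine "good with bad", while a quadratic utility (u''' = 0) is exactly indifferent.

```python
# Numeric illustration of the lottery dichotomy behind prudence: a prudent
# decision maker prefers the sure loss in one state and the zero-mean risk in
# the other ("good with bad"); quadratic utility (u''' = 0) is indifferent.
import math

w, k, eps = 10.0, 3.0, 2.0          # initial wealth, sure loss, zero-mean risk of +/- eps

def eu(u, lottery):
    """Expected utility of a list of (probability, wealth) pairs."""
    return sum(p * u(x) for p, x in lottery)

# "good with bad": the loss sits in one state, the zero-mean risk in the other
disaggregated = [(0.5, w - k), (0.25, w + eps), (0.25, w - eps)]
# "bad with bad": the loss and the zero-mean risk are stacked in the same state
aggregated = [(0.25, w - k + eps), (0.25, w - k - eps), (0.5, w)]

log_u = math.log                              # risk averse and prudent (u''' > 0)
quad_u = lambda x: x - 0.02 * x * x           # risk averse but u''' = 0

print("log utility prefers good-with-bad :", eu(log_u, disaggregated) > eu(log_u, aggregated))
print("quadratic utility is indifferent  :",
      abs(eu(quad_u, disaggregated) - eu(quad_u, aggregated)) < 1e-9)
```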

15.
I recently discussed pitfalls in attempted causal inference based on reduced‐form regression models. I used as motivation a real‐world example from a paper by Dr. Sneeringer, which interpreted a reduced‐form regression analysis as implying the startling causal conclusion that “doubling of [livestock] production leads to a 7.4% increase in infant mortality.” This conclusion is based on: (A) fitting a reduced‐form regression model to aggregate (e.g., county‐level) data; and (B) (mis)interpreting a regression coefficient in this model as a causal coefficient, without performing any formal statistical tests for potential causation (such as conditional independence, Granger‐Sims, or path analysis tests). Dr. Sneeringer now adds comments that confirm and augment these deficiencies, while advocating methodological errors that, I believe, risk analysts should avoid if they want to reach logically sound, empirically valid, conclusions about cause and effect. She explains that, in addition to (A) and (B) above, she also performed other steps such as (C) manually selecting specific models and variables and (D) assuming (again, without testing) that hand‐picked surrogate variables are valid (e.g., that log‐transformed income is an adequate surrogate for poverty). In her view, these added steps imply that “critiques of A and B are not applicable” to her analysis and that therefore “a causal argument can be made” for “such a strong, robust correlation” as she believes her regression coefficient indicates. However, multiple wrongs do not create a right. Steps (C) and (D) exacerbate the problem of unjustified causal interpretation of regression coefficients, without rendering irrelevant the fact that (A) and (B) do not provide evidence of causality. This reply focuses on whether any statistical techniques can produce the silk purse of a valid causal inference from the sow's ear of a reduced‐form regression analysis of ecological data. We conclude that Dr. Sneeringer's analysis provides no valid indication that air pollution from livestock operations causes any increase in infant mortality rates. More generally, reduced‐form regression modeling of aggregate population data—no matter how it is augmented by fitting multiple models and hand‐selecting variables and transformations—is not adequate for valid causal inference about health effects caused by specific, but unmeasured, exposures.

16.
The present study investigates U.S. Department of Agriculture inspection records in the Agricultural Quarantine Activity System database to estimate the probability of quarantine pests on propagative plant materials imported from various countries of origin and to develop a methodology ranking the risk of country–commodity combinations based on quarantine pest interceptions. Data collected from October 2014 to January 2016 were used for developing predictive models and for a validation study. A generalized linear model with Bayesian inference and a generalized linear mixed effects model were used to compare the interception rates of quarantine pests on different country–commodity combinations. Prediction ability of generalized linear mixed effects models was greater than that of generalized linear models. The estimated pest interception probability and confidence interval for each country–commodity combination was categorized into one of four compliance levels: “High,” “Medium,” “Low,” and “Poor/Unacceptable,” using K‐means clustering analysis. This study presents a risk‐based categorization for each country–commodity combination based on the probability of quarantine pest interceptions and the uncertainty in that assessment.
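In the spirit of the described workflow, the sketch below estimates interception probabilities with a simple Beta-Binomial posterior and groups country-commodity combinations into four compliance levels with K-means; the counts are simulated and the conjugate model is a stand-in for the paper's GLM/GLMM analysis.

```python
# Sketch (simulated counts, not AQAS records): per country-commodity interception
# probabilities from a Beta-Binomial posterior, then four compliance levels via K-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n_combos = 60
shipments = rng.integers(20, 2_000, n_combos)                 # inspected consignments
interceptions = rng.binomial(shipments, rng.beta(1, 40, n_combos))

# Beta(1, 1) prior -> posterior Beta(1 + x, 1 + n - x); keep the mean and its spread
a, b = 1 + interceptions, 1 + shipments - interceptions
post_mean = a / (a + b)
post_sd = np.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))

features = np.column_stack([post_mean, post_sd])              # cluster on estimate and uncertainty
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)

# order the clusters from lowest to highest interception rate and name the levels
order = np.argsort([post_mean[labels == c].mean() for c in range(4)])
level_names = np.array(["High", "Medium", "Low", "Poor/Unacceptable"])
compliance = level_names[np.argsort(order)][labels]
for lvl in level_names:
    print(f"{lvl:>18}: {np.sum(compliance == lvl)} country-commodity combinations")
```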

17.
DiMaggio and Powell (1983) argued that organizations, in their quest for legitimacy, are subjected to isomorphic pressures which produce increasing similarity among peer organizations over time: “Once an organizational field becomes well established ... there is an inexorable push toward homogenization.” Yet, in contradiction to this “iron cage” hypothesis, many industries became more heterogeneous, not more homogeneous, in their profiles during the latter decades of the twentieth century, particularly between about 1980 and 2000 (at least on the American landscape). Why didn’t “inexorable homogenization” occur? We argue that DiMaggio and Powell were correct about the forces that give rise to isomorphism but failed to anticipate several major macrosocial trends that caused those forces all to move in directions that diminished, rather than accentuated, isomorphism. For example, DiMaggio and Powell argued that ambiguity about goals will propel isomorphic change; but the goals for publicly-traded U.S. corporations became less ambiguous. They hypothesized that the fewer the alternative organizational models in a field, the faster the rate of isomorphism; but the array of organizational models increased significantly. We empirically illustrate the increased heterogeneity that occurred within American industries by tracing the trend toward divergence – on several dimensions of strategy and performance – within the steel industry. An analysis of 18 additional industries similarly yields far more evidence of increased heterogeneity than of increased homogeneity over the latter decades of the twentieth century. We go on to argue that reduced isomorphic pressures not only engendered greater intraindustry variety, but also increased managerial discretion, which contributed greatly to the romanticization of CEOs that occurred during the period 1980–2000.

18.
Operations managers clearly play a critical role in targeting plant‐level investments toward environment and safety practices. In principle, a “rational” response would be to align this investment with senior management's competitive goals for operational performance. However, operations managers also are influenced by contingent factors, such as their national culture, thus creating potential tension that might bias investment away from a simple rational response. Using data from 1,453 plants in 24 countries, we test the moderating influence of seven of the national cultural characteristics on investment at the plant level in environment and safety practices. Four of the seven national cultural characteristics from GLOBE (i.e., uncertainty avoidance, in‐group collectivism, future orientation and performance orientation) shifted investment away from an expected “rational” response. Positive bias was evident when the national culture favored consistency and formalized procedures and rewarded performance improvement. In contrast, managers exhibited negative bias when familial groups and local coalitions were powerful, or future outcomes—rather than current actions—were more important. Overall, this study highlights the critical importance of moving beyond a naïve expectation that plant‐level investment will naturally align with corporate competitive goals for environment and safety. Instead, the national culture where the plant is located will influence these investments, and must be taken into account by senior management.

19.
We study time‐based policies on pricing and leadtime for a build‐to‐order and direct sales manufacturer. It is assumed that the utility of the product varies among potential customers and decreases over time, and that a potential customer will place an order if his or her utility is higher than the manufacturer's posted price. Once an order is placed, it will be delivered to the customer after a length of time called “leadtime.” Because of the decrease in a customer's utility during leadtime, a customer will cancel the order if the utility falls below the ordering price before the order is received. The manufacturer may choose to offer discounted prices to customers who would otherwise cancel their orders. We discuss two price policies: common discounted price and customized discounted price. In the common discounted price policy, the manufacturer offers a single lower price to the customers; in the customized discounted price policy, the manufacturer offers each customer an individually tailored new price. Our analytical and numerical studies show that the discounted price policies result in higher revenue and that the customized discounted price policy significantly outperforms the common discounted price policy when product utility decreases rapidly. We also study two leadtime policies when production cost decreases over time. The first uses a fixed leadtime, and the second allows the leadtime to vary dynamically over time. We find that the dynamic leadtime policy significantly outperforms the fixed leadtime policy when the product cost decreases rapidly.
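A simulation sketch of the pricing comparison under stated assumptions (exponential utility heterogeneity, exponential utility decay during the leadtime) is shown below; it is not the paper's model, but it reproduces the qualitative ranking of the three pricing options.

```python
# Simulation sketch (illustrative assumptions, not the paper's model): customers
# order when their utility exceeds the posted price, utility decays during the
# leadtime, and an order is canceled unless a price below the decayed utility is
# offered. Compare a single common discount with customized discounts.
import numpy as np

rng = np.random.default_rng(4)
n, posted_price, leadtime, decay = 100_000, 60.0, 10.0, 0.03

utility_at_order = posted_price + rng.exponential(20.0, n)    # only buyers are modeled
utility_at_delivery = utility_at_order * np.exp(-decay * leadtime)

def revenue(price_at_delivery):
    kept = utility_at_delivery >= price_at_delivery           # order kept only if still worth it
    return np.where(kept, price_at_delivery, 0.0).sum() / n

no_discount = revenue(np.full(n, posted_price))
common_discount = max(revenue(np.full(n, p)) for p in np.linspace(30, 60, 61))
customized = revenue(np.minimum(posted_price, utility_at_delivery))  # price each order individually

print(f"no discount       : {no_discount:6.2f} per customer")
print(f"best common price : {common_discount:6.2f} per customer")
print(f"customized prices : {customized:6.2f} per customer")
```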

20.
The transition from economic stagnation to sustained growth is often modeled through “population‐induced” productivity improvements, which are assumed rather than derived from primary assumptions. In this paper the effect of population on productivity is derived from optimal behavior. More precisely, both the number and location of education facilities are chosen optimally by municipalities. Individuals determine their education investment depending on the distance to the nearest school, and also on technical progress and longevity. In this setting, higher population density enables the set‐up costs of additional schools to be covered, making it possible to reach higher educational levels. Using counterfactual experiments we find that one‐third of the rise in literacy can be directly attributed to the effect of density, and one‐sixth is linked to higher longevity. Moreover, the effect of population density in the model is consistent with the available evidence for England, where it is shown that schools were established at a high rate over the period 1540–1620. (JEL: O41, I21, R12, J11)

