Similar documents (20 records)
1.
How much discretion should the monetary authority have in setting its policy? This question is analyzed in an economy with an agreed‐upon social welfare function that depends on the economy's randomly fluctuating state. The monetary authority has private information about that state. Well‐designed rules trade off society's desire to give the monetary authority discretion to react to its private information against society's need to prevent that authority from giving in to the temptation to stimulate the economy with unexpected inflation (the time inconsistency problem). Although this dynamic mechanism design problem seems complex, its solution is simple: legislate an inflation cap. The optimal degree of monetary policy discretion turns out to shrink as the severity of the time inconsistency problem increases relative to the importance of private information. In an economy with a severe time inconsistency problem and unimportant private information, the optimal degree of discretion is none.

2.
A comparison of the effects of exogenous shocks to global crude oil production on seven major industrialized economies suggests a fair degree of similarity in the real growth responses. An exogenous oil supply disruption typically causes a temporary reduction in real GDP growth that is concentrated in the second year after the shock. Inflation responses are more varied. The median CPI inflation response peaks after three to four quarters. Exogenous oil supply disruptions need not generate sustained inflation or stagflation. Typical responses include a fall in the real wage, higher short‐term interest rates, and a depreciating currency with respect to the dollar. Despite many qualitative similarities, there is strong statistical evidence that the responses to exogenous oil supply disruptions differ across G7 countries. For suitable subsets of countries, homogeneity cannot be ruled out. A counterfactual historical exercise suggests that the evolution of CPI inflation in the G7 countries would have been similar overall to the actual path even in the absence of exogenous shocks to oil production, consistent with a monetary explanation of the inflation of the 1970s. There is no evidence that the 1973–1974 and 2002–2003 oil supply shocks had a substantial impact on real growth in any G7 country, whereas the 1978–1979, 1980, and 1990–1991 shocks contributed to lower growth in at least some G7 countries. (JEL: E31, E32, Q43)

3.
Bas Ter Weel, LABOUR, 2003, 17(3): 361–382
Abstract. For many OECD countries an increase in wage inequality has been documented since the early 1980s. This is often attributed to a general rise in the demand for skilled workers resulting from recent technological change. Using the Organization for Strategic Labour Market Research (OSA) Labour Supply data, this paper studies the wage structure in the Netherlands over the period 1986–98 and demonstrates that wage inequality did not increase to any significant extent in the Netherlands. Using the accounting framework proposed by Juhn et al. (Journal of Political Economy 101: 410–442, 1993), it is shown that the relatively stable wage structure until at least the late 1990s can be attributed mainly to returns to observable components, such as education and experience, while residual wage inequality is found to be of minor importance in explaining the Dutch wage structure. These estimates suggest that the demand for skill in the Netherlands probably did not rise to the extent it did in many other countries over this period.
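For orientation, the sketch below is a simplified variance analogue of the Juhn–Murphy–Pierce accounting idea referred to above: split the variance of log wages into a part explained by observable components (education, experience) and a residual part, and compare the split across years. The data, coefficients, and variable names are illustrative assumptions, not the OSA data or the paper's estimates.

```python
import numpy as np

def wage_variance_decomposition(X, log_wage):
    """Split Var(log wage) into a part explained by observables entering a
    linear wage equation and a residual part. This is a simplified variance
    analogue of the Juhn, Murphy, and Pierce (1993) framework, not their
    full distributional decomposition."""
    X1 = np.column_stack([np.ones(len(log_wage)), X])
    beta, *_ = np.linalg.lstsq(X1, log_wage, rcond=None)
    fitted = X1 @ beta
    return np.var(fitted), np.var(log_wage - fitted)

# Illustrative synthetic "years"; a real application would use the OSA panel.
rng = np.random.default_rng(0)
for year, return_to_education in [(1986, 0.05), (1998, 0.06)]:
    education = rng.normal(12.0, 2.5, 2000)
    experience = rng.uniform(0.0, 40.0, 2000)
    log_wage = (1.5 + return_to_education * education + 0.02 * experience
                + 0.25 * rng.standard_normal(2000))
    explained, residual = wage_variance_decomposition(
        np.column_stack([education, experience]), log_wage)
    print(year, f"explained: {explained:.3f}   residual: {residual:.3f}")
```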

4.
Hendrik Jürges, LABOUR, 2003, 17(4): 489–518
Abstract. Using German panel data, I examine the long‐term development in satisfaction with work from 1984 until 2001. As was the case for many other industrialized countries, Germany witnessed a sharp decline in workers' self‐reported job satisfaction in the late 1980s and 1990s, the reasons for which are as yet unknown. I present a cohort analysis of job satisfaction using various identifying assumptions to examine several explanations for this phenomenon: pure cohort effects, a decrease in self‐reported job security, an increase in stress at work and a deterioration in other job conditions, and possible survey artefacts such as interviewer or repeated measurement effects. However, none of these can explain the overall decline in job satisfaction.

5.
This paper is based on data collected in the late 1980s and again in the late 1990s from interviews with chairmen, chief executives and board members in 12 large UK organizations such as Hanson, Marks & Spencer, Prudential and Glynwed. Although the primary focus is on theorizing and theory over time, this also leads us to question matters of method and methodology. The first section considers some of the study design issues raised by conducting this sequel study, noting that it was not possible to 'repeat' the first study for a number of important reasons. The second section observes that while our earlier analytical metaphor of organizing as explaining endures, the nature of the explanations has changed: 'strategic focus', 'shareholder value' and 'corporate governance' are now the contemporary watchwords, although they were unheard of in our interviews a decade earlier. The following section builds on this, concluding that in making judgements about future shareholder value, the primary evidence is drawn from events already past and interpreted through current explanations. We conclude with the importance of time to our theorizing, where there appears to be a confluence between time and person, created in part and supported in part by particular (judgements of) explanations of organizing prevailing at that time.

6.
Abstract

There has been a growing debate about the role of history in management research, with several authors making suggestions on how to bring the two (back) together and others even highlighting the need for a “historic turn”. What we argue in this paper is that, while history was indeed sidelined by the scientization of management since the late 1950s, it started to make a comeback from the 1980s onwards and is increasingly employed in a number of research programs. We stress that the crucial question for management scholars engaging with history (or wanting to do so) is how it relates to theory. First of all, we present a systematic overview of the way history has been used, both at the micro (organizational) and macro levels of analysis, distinguishing between what we refer to as “history to theory” and “history in theory”. In the former, we consider those research programs, such as (neo-)institutionalism, where history serves as evidence to develop, modify or test theories. In the case of “history in theory” we identify research programs where history or the past are part of the theoretical model itself as a driver or moderator, with “imprinting” as a prime example. Second, we also identify a growing number of studies that go further by displaying what we call “historical cognizance” in the sense of incorporating period effects or historical contingencies into their theorizing efforts. Finally, drawing on our broad overview, we make more specific suggestions for increasing the visibility and influence of history in organization and management theory.

7.
Abstract

Applications of behavior analysis in the private sector became visible in the late 1960s and early 1970s. By the 1980s, the field of Organizational Behavior Management (OBM) was a well-established discipline. This article chronicles the people, events and publications that contributed to the formation of the field, beginning with the precursors in the 1950s and ending in the early 1980s. The contributions of individuals who have been honored by the OBM Network are detailed and emphasized. Although some historical accounts attribute the development of OBM to influences from traditional management fields, the present account, through documentation of the formative events, argues that the field developed in relative isolation from such influences, emanating primarily from Skinner's development of programmed instruction and the advent of behavioral applications in other settings. While the application of psychology to the workplace predated behavioral involvement, the primary force for the development and growth of OBM came from within the field of behavior analysis.

8.
Reforming the public sector has been on the agenda of nations throughout the world since the late 1970s. Fiji is no exception: it has been reforming its commercial and industrial enterprises since the late 1980s. The government of Fiji has reformed most of its enterprises with an avowed objective of enhancing profitability, productivity, efficiency and accountability. This paper attempts to share some of the experiences of the public enterprise reform process in Fiji. It aims to analyze the background, process, contents and impact of the reform and examine the factors impeding the reform program. It highlights that (a) both internal and external factors were responsible for introducing reforms; (b) the reform efforts have not been able to produce the desired results; (c) structural inadequacies in institutions and organizations have created bottlenecks in the reform process; and (d) uncertainty in the political sphere has contributed further to policy shifts.

9.
This paper introduces the model confidence set (MCS) and applies it to the selection of models. An MCS is a set of models constructed so that it contains the best model with a given level of confidence. The MCS is in this sense analogous to a confidence interval for a parameter. The MCS acknowledges the limitations of the data: uninformative data yield an MCS with many models, whereas informative data yield an MCS with only a few models. The MCS procedure does not assume that a particular model is the true model; in fact, the MCS procedure can be used to compare more general objects, beyond the comparison of models. We apply the MCS procedure to two empirical problems. First, we revisit the inflation forecasting problem posed by Stock and Watson (1999), and compute the MCS for their set of inflation forecasts. Second, we compare a number of Taylor rule regressions and determine the MCS of the best regression in terms of in‐sample likelihood criteria.
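A toy sketch of the elimination logic behind an MCS: repeatedly test whether all surviving models have equal expected loss (here via a bootstrapped max-t statistic on loss differentials) and drop the worst model until the test no longer rejects. This is a heavily simplified illustration under iid assumptions, not the block-bootstrap procedure of Hansen, Lunde, and Nason; all data are synthetic.

```python
import numpy as np

def model_confidence_set(losses, alpha=0.10, n_boot=1000, seed=0):
    """Toy MCS: `losses` is a (T x m) array of per-period losses for m models.
    Returns the indices of the models that survive elimination at level alpha."""
    rng = np.random.default_rng(seed)
    T, m = losses.shape
    surviving = list(range(m))
    while len(surviving) > 1:
        sub = losses[:, surviving]
        d = sub - sub.mean(axis=1, keepdims=True)   # loss relative to cross-model average
        dbar = d.mean(axis=0)
        se = d.std(axis=0, ddof=1) / np.sqrt(T)
        t_obs = np.max(dbar / se)
        # Bootstrap the null distribution of the max-t statistic (iid resampling).
        t_boot = np.empty(n_boot)
        for b in range(n_boot):
            db = d[rng.integers(0, T, T)]
            se_b = db.std(axis=0, ddof=1) / np.sqrt(T)
            t_boot[b] = np.max((db.mean(axis=0) - dbar) / se_b)
        if np.mean(t_boot >= t_obs) >= alpha:
            break                                   # equality not rejected: stop eliminating
        surviving.pop(int(np.argmax(dbar / se)))    # drop the model with the worst t-statistic
    return surviving

# Three forecasting "models": the third has clearly higher expected loss.
rng = np.random.default_rng(1)
L = np.column_stack([rng.normal(1.00, 1, 500),
                     rng.normal(1.05, 1, 500),
                     rng.normal(1.50, 1, 500)])
print("models in the MCS:", model_confidence_set(L))
```

With uninformative (noisier) losses the surviving set stays large, mirroring the confidence-interval analogy described above.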

10.
Spatial and/or temporal clustering of pathogens will invalidate the commonly used assumption of Poisson‐distributed pathogen counts (doses) in quantitative microbial risk assessment. In this work, the theoretically predicted effect of spatial clustering in conventional “single‐hit” dose‐response models is investigated by employing the stuttering Poisson distribution, a very general family of count distributions that naturally models pathogen clustering and contains the Poisson and negative binomial distributions as special cases. The analysis is facilitated by formulating the dose‐response models in terms of probability generating functions. It is shown formally that the theoretical single‐hit risk obtained with a stuttering Poisson distribution is lower than that obtained with a Poisson distribution, assuming identical mean doses. A similar result holds for mixed Poisson distributions. Numerical examples indicate that the theoretical single‐hit risk is fairly insensitive to moderate clustering, though the effect tends to be more pronounced for low mean doses. Furthermore, using Jensen's inequality, an upper bound on risk is derived that tends to better approximate the exact theoretical single‐hit risk for highly overdispersed dose distributions. The bound holds with any dose distribution (characterized by its mean and zero inflation index) and any conditional dose‐response model that is concave in the dose variable. Its application is exemplified with published data from Norovirus feeding trials, for which some of the administered doses were prepared from an inoculum of aggregated viruses. The potential implications of clustering for dose‐response assessment as well as practical risk characterization are discussed.
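To make the pgf formulation above concrete: with per-organism infection probability r, the single-hit risk is 1 − G_N(1 − r), where G_N is the pgf of the dose distribution N. The sketch below compares a Poisson dose distribution with a negative binomial one (a member of the stuttering Poisson family) at the same mean dose; the values of r, the mean doses, and the dispersion parameter k are illustrative assumptions, not values from the paper.

```python
import numpy as np

def risk_poisson(mean_dose, r):
    """Single-hit risk 1 - G_N(1 - r) when the dose N is Poisson(mean_dose)."""
    return 1.0 - np.exp(-r * mean_dose)              # Poisson pgf exp(mu*(s-1)) at s = 1 - r

def risk_negative_binomial(mean_dose, r, k):
    """Same risk when N is negative binomial with mean mu and dispersion k
    (smaller k = stronger clustering / overdispersion)."""
    return 1.0 - (1.0 + r * mean_dose / k) ** (-k)    # NB pgf evaluated at s = 1 - r

r, k = 0.5, 0.2                                       # illustrative values only
for mu in (0.1, 1.0, 10.0):
    print(f"mean dose {mu:5.1f}:  Poisson {risk_poisson(mu, r):.4f}"
          f"   clustered (NB) {risk_negative_binomial(mu, r, k):.4f}")
```

At every mean dose the clustered distribution gives a risk no higher than the Poisson one, consistent with the formal result stated above.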

11.
In the 1970s, large increases in the price of oil were associated with sharp decreases in output and large increases in inflation. In the 2000s, even larger increases in the price of oil were associated with much milder movements in output and inflation. Using a structural VAR approach, Blanchard and Gali (in J. Gali and M. Gertler (eds.) 2009, International Dimensions of Monetary Policy, University of Chicago Press, pp. 373–428) argued that this reflected a change in the causal relation from the price of oil to output and inflation. They then argued that this change could be due to a combination of three factors: a smaller share of oil in production and consumption, lower real wage rigidity, and better monetary policy. Their argument, based on simulations of a simple new‐Keynesian model, was informal. Our purpose in this paper is to take the next step, and to estimate the explanatory power and contribution of each of these factors. To do so, we use a minimum distance estimator that minimizes, over the set of structural parameters and for each of two samples (pre‐ and post‐1984), the distance between the empirical SVAR‐based impulse response functions and those implied by a new‐Keynesian model. Our empirical results point to an important role for all three factors.
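A schematic sketch of a minimum distance estimator of this kind: choose structural parameters to minimize a quadratic distance between empirical (SVAR-based) impulse responses and the model-implied ones. Here `model_irf` is a hypothetical stand-in for the new-Keynesian model's mapping from parameters to impulse responses, the weighting matrix is the identity, and the "empirical" IRFs are synthetic; none of this reproduces the paper's model or estimates.

```python
import numpy as np
from scipy.optimize import minimize

HORIZONS = 20

def model_irf(theta):
    """Hypothetical stand-in: map structural parameters to a stacked vector of
    model-implied impulse responses (e.g., output and inflation after an oil
    shock). A real application would solve the new-Keynesian model here."""
    persistence, scale = theta
    h = np.arange(HORIZONS)
    output = -scale * persistence ** h
    inflation = 0.5 * scale * persistence ** h
    return np.concatenate([output, inflation])

def distance(theta, empirical_irf, weight=None):
    """Quadratic distance between empirical and model-implied impulse responses."""
    gap = empirical_irf - model_irf(theta)
    W = np.eye(gap.size) if weight is None else weight
    return gap @ W @ gap

# Synthetic "empirical" IRFs; in the paper these come from pre- and post-1984 SVARs.
rng = np.random.default_rng(0)
empirical = model_irf((0.8, 1.0)) + 0.05 * rng.standard_normal(2 * HORIZONS)
fit = minimize(distance, x0=np.array([0.5, 0.5]), args=(empirical,), method="Nelder-Mead")
print("estimated structural parameters:", np.round(fit.x, 3))
```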

12.
In the map verification problem, a robot is given a (possibly incorrect) map M of the world G with its position and orientation indicated on the map. The task is to find out whether this map, for the given robot position and its orientation in the map, is correct for the world G. We consider the world model of a graph G = (V_G, E_G) in which, for each vertex, edges incident to the vertex are ordered cyclically around that vertex. (This also holds for the map M = (V_M, E_M).) The robot can traverse edges and enumerate edges incident on the current vertex, but it cannot distinguish vertices (and edges) from each other. To solve the verification problem, the robot uses a portable edge marker that it can put down at an edge of the graph world G and pick up later as needed. The robot can recognize the edge marker when it encounters it in the world G. By reducing the verification problem to an exploration problem, verification can be completed in O(|V_G| × |E_G|) edge traversals (the mechanical cost) with the help of a single vertex marker which can be dropped and picked up at vertices of the graph world (G. Dudek, M. Jenkin, E. Milios, and D. Wilkes, IEEE Trans. on Robotics and Automation, vol. 7, pp. 859–865, 1991; Robotics and Autonomous Systems, vol. 22(2), pp. 159–178, 1997). In this paper, we show a strategy that verifies a map in only O(|V_M|) edge traversals, using a single edge marker, when M is a plane embedded graph, even though G may not be planar (e.g., G may contain overpasses, tunnels, etc.).
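For context, the "edges ordered cyclically around each vertex" world model is what a rotation-system (combinatorial map) representation captures; the sketch below shows such a data structure and how the faces of a plane embedding can be read off from it. It illustrates the representation only, not the paper's verification strategy, and the example graph is an assumption for demonstration.

```python
from typing import Dict, List, Tuple

# Rotation system: for each vertex, its neighbors in cyclic (counterclockwise) order.
Rotation = Dict[int, List[int]]

def trace_faces(rot: Rotation) -> List[List[Tuple[int, int]]]:
    """Enumerate the face boundaries of the embedding defined by a rotation
    system (simple graph assumed). Each face is a cyclic list of directed
    edges (darts)."""
    unused = {(u, v) for u, nbrs in rot.items() for v in nbrs}
    faces = []
    while unused:
        start = next(iter(unused))
        face, dart = [], start
        while True:
            face.append(dart)
            unused.discard(dart)
            u, v = dart
            i = rot[v].index(u)                          # where the reverse dart sits at v
            dart = (v, rot[v][(i + 1) % len(rot[v])])    # next edge in v's cyclic order
            if dart == start:
                break
        faces.append(face)
    return faces

# A square 0-1-2-3 with the diagonal 0-2, embedded in the plane (3 faces expected).
rot = {0: [1, 2, 3], 1: [2, 0], 2: [3, 0, 1], 3: [0, 2]}
for face in trace_faces(rot):
    print(face)
```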

13.
Abstract

We addressed the inclusion of behavioral analyses in the research and case study articles published in the Journal of Organizational Behavior Management (JOBM) over the past 5 years (1992–1997). The amount of behavior analysis included in JOBM articles appears to be greater than that found in JABA in the early 1980s. However, the presence of such analyses in JOBM can still be increased. Further, a significant number of articles do not mention how the particular intervention was chosen to address a specific organizational problem. This lack of a functional assessment makes it difficult for readers to decide if a proposed intervention is applicable to a situation with which they might be dealing. Additionally, we examined the range of behavioral principles discussed in the reviewed articles and found little diversity among the principles used. This paper serves to address some of the problems and ramifications associated with this separation and to offer suggestions as to how this situation could be improved.

14.
Leptospirosis is a preeminent zoonotic disease concentrated in tropical areas, and prevalent in both industrialized and rural settings. Dose‐response models were generated from 22 data sets reported in 10 different studies. All of the selected studies used rodent subjects, primarily hamsters, with mortality as the predominant endpoint and the challenge strain administered intraperitoneally. Dose‐response models based on a single evaluation postinfection displayed median lethal dose (LD50) estimates that ranged between 1 and 10^7 leptospirae, depending upon the strain's virulence and the period elapsed since the initial exposure inoculation. Twelve of the 22 data sets measured the number of affected subjects daily over an extended period, so dose‐response models with time‐dependent parameters were estimated. Pooling between data sets produced seven common dose‐response models and one time‐dependent model. These pooled common models combined data sets with different test subject hosts, and with disparate leptospiral strains tested on identical hosts. Comparative modeling with parallel tests was used to isolate the effect of changing a single variable (either strain or test host) and to quantify the difference by calculating a dose multiplication factor. Statistical pooling implies that the mechanistic processes of leptospirosis can be represented by the same dose‐response model for different experimental infection tests even though they may involve different host species, routes, and leptospiral strains, although the cause of this pathophysiological phenomenon has not yet been identified.
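A minimal sketch of how an LD50 falls out of a fitted dose-response model, using the one-parameter exponential (single-hit) form P(death) = 1 − exp(−k·dose) and maximum likelihood on binomial mortality counts. The dose-mortality data below are illustrative, not from the reviewed studies, and the pooled papers also use other model forms and time-dependent parameters.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative dose-mortality data (not from the reviewed studies).
doses = np.array([1e1, 1e2, 1e3, 1e4, 1e5])          # leptospirae administered
n_animals = np.array([10, 10, 10, 10, 10])
n_dead = np.array([0, 2, 5, 8, 10])

def p_death(dose, k):
    """Exponential (single-hit) dose-response model."""
    return 1.0 - np.exp(-k * dose)

def negative_log_likelihood(log_k):
    """Binomial log-likelihood (up to a constant) of the mortality counts."""
    p = np.clip(p_death(doses, np.exp(log_k)), 1e-12, 1.0 - 1e-12)
    return -np.sum(n_dead * np.log(p) + (n_animals - n_dead) * np.log(1.0 - p))

fit = minimize_scalar(negative_log_likelihood, bounds=(-20.0, 0.0), method="bounded")
k_hat = np.exp(fit.x)
print(f"k = {k_hat:.2e},  LD50 = ln(2)/k = {np.log(2.0) / k_hat:.0f} organisms")
```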

15.
Much of our understanding of competitive advantage draws upon the experience of Western firms. Massive Japanese investment in an effort to replicate keiretsu (interfirm) networks in Asia since the 1980s presents fertile ground for shedding new light on the sources of competitive advantage. Building on such an experience, this article develops a multilevel perspective focusing on how competitive advantage is preserved and strengthened for the firms, networks, and nations involved. Its hallmark is careful attention to levels of analysis by (a) spelling out the attendant assumption of homogeneity among keiretsu member firms, (b) explaining the basis of such an assumption, (c) exploring alternative assumptions, and (d) drawing upon diverse subtopics within the strategy literature.

16.
We characterize, in the Anscombe–Aumann framework, the preferences for which there are a utility function u on outcomes and an ambiguity index c on the set of probabilities on the states of the world such that, for all acts f and g, f ≿ g if and only if min_p { ∫ u(f) dp + c(p) } ≥ min_p { ∫ u(g) dp + c(p) }, where the minimum is taken over probabilities p on the states of the world. The function u represents the decision maker's risk attitudes, while the index c captures his ambiguity attitudes. These preferences include the multiple priors preferences of Gilboa and Schmeidler and the multiplier preferences of Hansen and Sargent. This provides a rigorous decision‐theoretic foundation for the latter model, which has been widely used in macroeconomics and finance.
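As a point of reference (standard in this literature, and consistent with the two special cases named above), the ambiguity index c recovers the two models as follows; a sketch in LaTeX, with C a set of priors, q a reference prior, R(·‖·) relative entropy, and θ > 0:

```latex
V(f) = \min_{p \in \Delta}\Big(\int u(f)\,dp + c(p)\Big), \qquad f \succsim g \iff V(f) \ge V(g).
% Multiple priors (Gilboa--Schmeidler): c(p) = 0 if p \in C and +\infty otherwise, so that
%   V(f) = \min_{p \in C} \int u(f)\,dp.
% Multiplier preferences (Hansen--Sargent): c(p) = \theta\, R(p \,\|\, q), so that
%   V(f) = \min_{p \in \Delta}\Big(\int u(f)\,dp + \theta\, R(p \,\|\, q)\Big).
```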

17.
This paper examines the stability of the wage inflation process in Britain and America from 1892 to 1991. Utilizing a simple model of the aggregate labor market that treats wage inflation and the annual change in the unemployment rate as jointly endogenous variables, we find no evidence of substantial parameter shifts. In particular, there is no evidence of a secular increase in wage rigidity in either country nor is there support for the notion that periods of persistently high unemployment, such as the 1930s or the 1980s, are characterized by significant increases in wage rigidity. Our results are consistent with the findings of several other studies and indicate that wage rigidity has been a stable characteristic of labor markets since the end of the 19th century. Our results also suggest that the unemployment performance of these two countries over time can be explained by the interaction of demand shocks and wage rigidity.

18.
This paper proposes a new framework for determining whether a given relationship is nonlinear, what the nonlinearity looks like, and whether it is adequately described by a particular parametric model. The paper studies a regression or forecasting model of the form y_t = μ(x_t) + ε_t, where the functional form of μ(⋅) is unknown. We propose viewing μ(⋅) itself as the outcome of a random process. The paper introduces a new stationary random field m(⋅) that generalizes finite‐differenced Brownian motion to a vector field and whose realizations could represent a broad class of possible forms for μ(⋅). We view the parameters that characterize the relation between a given realization of m(⋅) and the particular value of μ(⋅) for a given sample as population parameters to be estimated by maximum likelihood or Bayesian methods. We show that the resulting inference about the functional relation also yields consistent estimates for a broad class of deterministic functions μ(⋅). The paper further develops a new test of the null hypothesis of linearity based on the Lagrange multiplier principle and small‐sample confidence intervals based on numerical Bayesian methods. An empirical application suggests that properly accounting for the nonlinearity of the inflation‐unemployment trade‐off may explain the previously reported uneven empirical success of the Phillips Curve.
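Treating μ(·) itself as a draw from a random field is closely related in spirit to Gaussian-process regression; the sketch below illustrates that general idea with a squared-exponential kernel and synthetic data. It is an analogy only: the paper's field m(·), its covariance (a generalization of finite-differenced Brownian motion), its likelihood, and its linearity test are not reproduced here.

```python
import numpy as np

def sq_exp_kernel(x1, x2, length=1.0, scale=1.0):
    """Squared-exponential covariance between two sets of scalar inputs."""
    d = x1[:, None] - x2[None, :]
    return scale**2 * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=0.1):
    """Posterior mean of mu(.) under a zero-mean GP prior, given noisy observations."""
    K = sq_exp_kernel(x_train, x_train) + noise**2 * np.eye(len(x_train))
    return sq_exp_kernel(x_test, x_train) @ np.linalg.solve(K, y_train)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-3.0, 3.0, 80))
y = np.tanh(x) + 0.1 * rng.standard_normal(80)        # a mildly nonlinear "true" mu
grid = np.linspace(-3.0, 3.0, 7)
print(np.round(gp_posterior_mean(x, y, grid), 2))      # recovers the nonlinear shape
```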

19.
We are moving rapidly into an age of transnational manufacturing, where things made in one country are shipped across national borders for further work, storage, sales, repair, remanufacture, recycle, or disposal; but our laws, policies, and management practices are slow in adjusting to this reality. They are often based on inaccurate premises. This article examines these premises and suggests what they imply for the management of manufacturing. First, a common view is that manufacturing investment in the industrialized nations is declining and shifting to the developing countries. This is not true. Investment in manufacturing in both industrialized and developing nations is increasing and, in absolute value, there is a lot more investment in industrialized countries than in developing countries. Second, a related view argued by many is that manufacturing does not have a bright future in the rich countries. I argue that manufacturers can thrive in the industrialized countries if they learn how to add more value for the end users. They must go beyond productivity improvement to producing more technologically advanced and customized products, responding faster to changing customer demands, and appending more services to their products. Doing all this is easier in the industrialized countries because the needed skills and infrastructure are more readily available there. Third, another potentially misleading notion is related to why companies invest in manufacturing abroad. Access to low-cost production is not the main motivation in most cases; rather, it is access to markets. Superior global manufacturers use their foreign factories for much more: to serve their worldwide customers better, preempt competitors, work with sophisticated suppliers, collect critical marketing, technological, and competitive intelligence, and attract talented individuals into the company. They build integrated global production networks, not collections of disjointed factories spread internationally. Thus their investment in manufacturing abroad is not a substitute for investment at home; it is a complement. Building and managing such integrated global factory networks is the next challenge in manufacturing.

20.
Environmental tobacco smoke (ETS) has recently been determined by U.S. environmental and occupational health authorities to be a human carcinogen. We develop a model which permits using atmospheric nicotine measurements to estimate nonsmokers' ETS lung cancer risks in individual workplaces for the first time. We estimate that during the 1980s, the U.S. nonsmoking adult population's median nicotine lung exposure (homes and workplaces combined) was 143 micrograms (μg) of nicotine daily, and that the most-exposed adult nonsmokers inhaled 1430 μg/day. These exposure estimates are validated by pharmacokinetic modeling, which yields the corresponding steady-state dose of the nicotine metabolite, cotinine. For U.S. adult nonsmokers of working age, we estimate median cotinine values of about 1.0 nanogram per milliliter (ng/ml) in plasma and 6.2 ng/ml in urine; for the most-exposed nonsmokers, we estimate cotinine concentrations of about 10 ng/ml in plasma and 62 ng/ml in urine. These values are consistent to within 15% of the cotinine values observed in contemporaneous clinical epidemiological studies. The corresponding median risk from ETS exposure in U.S. nonsmokers during the 1980s is estimated at about two lung cancer deaths (LCDs) per 1000 at risk, and for the most-exposed nonsmokers, about two LCDs per 100. Risks abroad appear similar. Modeling of the lung cancer mortality risk from passive smoking suggests that de minimis [i.e., "acceptable" (10^-6)] risk occurs at an 8-hr time-weighted-average exposure concentration of 7.5 nanograms of ETS nicotine per cubic meter of workplace air for a working lifetime of 40 years. This model is based upon a linear exposure-response relationship validated by physical, clinical, and epidemiological data. From available data, it appears that workplaces without effective smoking policies considerably exceed this de minimis risk standard. For a substantial fraction of the 59 million nonsmoking workers in the U.S., current workplace exposure to ETS also appears to pose risks exceeding the de manifestis risk level above which carcinogens are strictly regulated by the federal government.
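A back-of-the-envelope sketch of the linear exposure-response relationship described above, with the slope calibrated solely from the quoted de minimis point (risk of 10^-6 at 7.5 ng/m^3 of ETS nicotine, 8-hr TWA, over a 40-year working lifetime); the other concentrations are illustrative and nothing else here comes from the paper.

```python
# Linear exposure-response sketch calibrated to the de minimis point quoted above.
DE_MINIMIS_RISK = 1e-6
DE_MINIMIS_CONC = 7.5          # ng of ETS nicotine per m^3, 8-hr time-weighted average
slope_per_ng_m3 = DE_MINIMIS_RISK / DE_MINIMIS_CONC   # lifetime risk per (ng/m^3), 40-yr exposure

def working_lifetime_risk(conc_ng_m3: float) -> float:
    """Lifetime lung cancer risk for a 40-year exposure at a given 8-hr TWA concentration."""
    return slope_per_ng_m3 * conc_ng_m3

for c in (7.5, 100.0, 1000.0):   # ng/m^3; higher values mimic poorly controlled workplaces
    print(f"{c:7.1f} ng/m^3  ->  lifetime risk {working_lifetime_risk(c):.1e}")
```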
