Similar Literature
A total of 20 similar documents were retrieved.
1.
We survey different models, techniques, and some recent results to tackle machine scheduling problems within a distributed setting. In traditional optimization, a central authority is asked to solve a (computationally hard) optimization problem. In contrast, in distributed settings there are several agents, possibly equipped with private information that is not publicly known, and these agents must interact to derive a solution to the problem. Usually the agents have their individual preferences, which induces them to behave strategically to manipulate the resulting solution. Nevertheless, one is often interested in the global performance of such systems. The analysis of such distributed settings requires techniques from classical optimization, game theory, and economic theory. The paper therefore briefly introduces the most important of the underlying concepts and gives a selection of typical research questions and recent results, focusing on applications to machine scheduling problems. This includes the study of the so-called price of anarchy for settings where the agents do not possess private information, as well as the design and analysis of (truthful) mechanisms in settings where the agents do possess private information.
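To make the price-of-anarchy notion concrete, the following Python sketch (a toy illustration of the general concept, not an example taken from the survey) enumerates all assignments of a few jobs to identical machines, identifies the pure Nash equilibria of the induced selfish load-balancing game, and compares the worst equilibrium makespan with the optimal makespan.

```python
from itertools import product

def loads(assignment, sizes, m):
    """Machine loads induced by an assignment (job i -> machine assignment[i])."""
    load = [0.0] * m
    for job, machine in enumerate(assignment):
        load[machine] += sizes[job]
    return load

def is_nash(assignment, sizes, m):
    """No job can strictly reduce its cost (its machine's load) by moving."""
    load = loads(assignment, sizes, m)
    for job, machine in enumerate(assignment):
        for other in range(m):
            if other != machine and load[other] + sizes[job] < load[machine]:
                return False
    return True

def price_of_anarchy(sizes, m):
    """Worst pure-Nash makespan divided by the optimal makespan (brute force)."""
    assignments = list(product(range(m), repeat=len(sizes)))
    opt = min(max(loads(a, sizes, m)) for a in assignments)
    worst_ne = max(max(loads(a, sizes, m)) for a in assignments if is_nash(a, sizes, m))
    return worst_ne / opt

# Two identical machines, jobs of size 2, 2, 1, 1: the optimum (makespan 3) pairs each
# large job with a small one, but placing the two large jobs together is also a Nash
# equilibrium (makespan 4), so the ratio is 4/3.
print(price_of_anarchy([2.0, 2.0, 1.0, 1.0], m=2))
```

For two identical machines this small instance already attains the known worst-case ratio of 4/3 for pure equilibria.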

2.
Abstract

The goals of this study were to explore existing practices used to facilitate knowledge transfer in MNCs in the information technology industry in China, and to investigate the frequency of use and the influence of each practice on knowledge transfer. In addition, this study examined which types of knowledge could be transferred by a given practice. The study used a two-stage method that included two surveys. The first survey was designed to develop an inclusive list of knowledge transfer practices; thirty-three practices were identified. The second survey found that: 1) the frequency of use and the influence on knowledge transfer varied from one practice to another; 2) MNCs frequently used those practices with a higher influence on knowledge transfer; and 3) certain practices were better suited to transferring certain types of knowledge.

3.
Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the United States. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate, and effects of hazardous chemicals and radioactive materials found at these sites. Although the U.S. Environmental Protection Agency (EPA), the U.S. Department of Energy (DOE), and the U.S. Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced directed guidance on models that must be used in these efforts. As a result, model selection is currently done on an ad hoc basis. This is administratively ineffective and costly, and can also result in technically inconsistent decision-making. To identify what models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE, and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study. A mail survey was conducted to identify models in use. The survey was sent to 550 persons engaged in the cleanup of hazardous and radioactive waste sites; 87 individuals responded. They represented organizations including federal agencies, national laboratories, and contractor organizations. The respondents identified 127 computer models that were being used to help support cleanup decision-making. A few models appeared to be used across a large number of sites (e.g., RESRAD). In contrast, the survey results also suggested that most sites were using models which were not reported in use elsewhere. Information is presented on the types of models being used and the characteristics of the models in use. Also shown is a list of models available, but not identified in the survey itself.

4.
This article reports an assessment of the growing use of Internet-based public participation methods, e-participation, in planning practice and university-level planning education in the USA. After documenting results from case study reviews of practice and a web-based survey of planning faculty, a comparative analysis reveals that academic programs are incorporating a range of e-participation tools; however, there is a need to increase curricular content to mirror trends in planning practice. The article concludes with recommendations on how to build on the strengths and to address the weaknesses observed in this study to better prepare students for the demands of planning practice.

5.
This study extends prior research on supply chain planning and integration by examining the underlying capabilities by which firms exploit the information they gain from integration activities. We use organizational information processing theory (OIPT) to develop hypotheses that identify the comprehensiveness of an organization's supply chain planning capabilities as an important mediator in the relationship between its supply chain integration activities and its operational performance. Further, our interpretation of OIPT suggests that an organization's usage of technology-enabled supply chain management systems (SCMS) moderates these effects. Using survey data from 445 global firms, we estimate the corresponding moderated-mediation structural model. The results indicate that usage of SCMS enables organizations to better utilize the information they gain from external integration efforts (relationships with customers and suppliers), thus improving the comprehensiveness of their supply chain planning capabilities. In contrast, the use of SCMS appears to be a partial substitute for internal integration as a driver of planning comprehensiveness. Most importantly, the results suggest that planning comprehensiveness is a significant generative means by which integration and technology investments produce superior operational performance. These findings provide a richer and more theoretically grounded explanation of relationships between supply chain integration, supply chain planning, and operational performance.
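As a rough illustration of the moderated-mediation structure described above (hypothetical variable names and simulated data; the paper itself estimates a full structural equation model), the sketch below fits the two regression stages with statsmodels: the integration-to-planning path moderated by SCMS usage, and performance regressed on planning comprehensiveness plus the direct integration effect.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 445
df = pd.DataFrame({
    "integration": rng.normal(size=n),   # external/internal integration (assumed composite)
    "scms_usage": rng.normal(size=n),    # usage of SCM systems (assumed moderator)
})
# Simulated data-generating process: SCMS usage strengthens the integration -> planning path.
df["planning"] = (0.4 * df.integration
                  + 0.3 * df.integration * df.scms_usage
                  + rng.normal(scale=0.5, size=n))
df["performance"] = 0.5 * df.planning + 0.1 * df.integration + rng.normal(scale=0.5, size=n)

# Stage 1: the moderated X -> M path (main effects plus interaction).
stage1 = smf.ols("planning ~ integration * scms_usage", data=df).fit()
# Stage 2: performance on the mediator and the direct effect of integration.
stage2 = smf.ols("performance ~ planning + integration", data=df).fit()
print(stage1.params, stage2.params, sep="\n")
```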

6.
The possibility of treason by a close associate has been a nightmare of most autocrats throughout history. More competent viziers are better able to discriminate among potential plotters, and this makes them more risky subordinates for the ruler. To avoid this, rulers, especially those who are weak and vulnerable, sacrifice the competence of their agents, hiring mediocre but loyal subordinates. Furthermore, any use of incentive schemes by a personalistic dictator is limited by the fact that all punishments are conditional on the dictator's own survival. We endogenize loyalty and competence in a principal-agent game between a dictator and his viziers in both static and dynamic settings. The dynamic model allows us to focus on the succession problem that insecure dictators face.

7.
Recent flood risk management puts an increasing emphasis on the public's risk perception and its preferences. It is now widely recognized that better knowledge of the public's awareness of and concern about risks is of vital importance for outlining effective risk communication strategies. Models such as the Risk Information Seeking and Processing model address this evolution by considering the public's needs and its information-seeking behavior with regard to risk information. This study builds upon earlier information-seeking models and focuses on the empirical relationships between information-seeking behavior and the constructs of risk perception, perceived hazard knowledge, response efficacy, and information need in the context of coastal flood risks. Specific focus is given to the mediating role of information need in the model and to the differences in information-seeking behavior between permanent and temporary residents. By means of a structured online questionnaire, a cross-sectional survey was carried out in the city of Ostend, one of the places on the Belgian coast most vulnerable to coastal flooding. Three hundred thirteen respondents participated in the survey. Path analysis reveals that, in contrast to risk perception and perceived knowledge, information need does not act as a mediator. In addition, it is shown that risk perception and perceived hazard knowledge are higher for permanent than for temporary residents, leading to increased information-seeking behavior among the former group. Implications for risk communication are discussed.

8.
On the one hand, physician executives are clinicians who place value on professional autonomy. As clinicians, the best interests of the patient drive their decision making and their value system. On the other hand, as managers, physician executives serve as agents of an organization. Because of the differences between the two cultures, some physicians have called the physician executive position a "no man's land." To address these issues and answer the questions that surround them, the authors developed a survey that was mailed to a random sample of the membership of the American College of Physician Executives. Parts of the survey had been used in other studies of role conflict and role ambiguity; other parts are new, developed specifically to analyze the physician executive role. The findings are reported in this article.

9.
Risks related to information technology (IT) are becoming a focus of concern in current risk debates. The use of IT is rapidly spreading in the population, as is access to computers in general. The present article reviews the literature on IT use (especially electronic mail and various Internet applications) and the related risks. In addition, the results from a survey about IT use and risk perception, given to a random sample of the Swedish population, are reported. In general, participants were quite positive toward IT and were, to some extent, aware of the related risks. However, the risks of IT were mostly seen as pertaining to other people, a finding in contrast with other results on perceived technology hazards. The attitude toward the use of IT was strongly related to general attitude toward computers, and less clearly to risk perception. Only a small percentage of the respondents reported having had negative experiences with IT hazards such as Internet addiction, depression, and social isolation. When extrapolated to the general population, however, these small percentages correspond to large groups of people who have been negatively affected by IT use.

10.
Nada R. Sanders, Karl B. Manrodt. Omega, 2003, 31(6): 511-522
In an era where forecasts drive entire supply chains, forecasting is seen as an increasingly critical organizational capability. However, business forecasting continues to rely on judgmental methods despite large advancements in information technology and quantitative method capability, prompting calls for research to help understand the reasons behind this practice. Our study is designed to contribute to this knowledge by profiling differences between firms identified as primary users of either judgmental or quantitative forecasting methods. Relying on survey data from 240 firms, we statistically analyzed differences between these categories of users based on a range of organizational and forecasting issues. Our study finds large differences in forecast error rates between the two groups, with users of quantitative methods significantly outperforming users of judgmental methods. Users of quantitative methods are found to be equally prevalent regardless of industry, firm size, and product positioning strategy, documenting the benefits of quantitative method use in a variety of settings. By contrast, users of judgmental methods are found to have significantly lower access to quantifiable data and to use information and technology to a lesser degree.

11.
Risk Analysis, 2018, 38(10): 2128-2143
Subjective probabilities are central to risk assessment, decision making, and risk communication efforts. Surveys measuring probability judgments have traditionally used open-ended response modes, asking participants to generate a response between 0% and 100%. A typical finding is the seemingly excessive use of 50%, perhaps as an expression of "I don't know." In an online survey with a nationally representative sample of the Dutch population, we examined the effect of response modes on the use of 50% and other focal responses, predictive validity, and respondents' survey evaluations. Respondents assessed the probability of dying, getting the flu, and experiencing other health-related events. They were randomly assigned to a traditional open-ended response mode, a visual linear scale ranging from 0% to 100%, or a version of that visual linear scale on which a magnifier emerged after clicking on it. We found that, compared to the open-ended response mode, the visual linear and magnifier scale each reduced the use of 50%, 0%, and 100% responses, especially among respondents with low numeracy. Responses given with each response mode were valid, in terms of significant correlations with health behavior and outcomes. Where differences emerged, the visual scales seemed to have slightly better validity than the open-ended response mode. Both high-numerate and low-numerate respondents' evaluations of the surveys were highest for the visual linear scale. Our results have implications for subjective probability elicitation and survey design.

12.
The study presented in this paper aimed to determine whether companies that embed information and communication technologies (ICT)-enabled time performance into their product offering can achieve better economic outcomes from technology adoption. Indeed, it is still questionable whether technology adoption results in superior profitability, especially if such an improvement is achieved through the use of functional ICT applications. In this study, we assume that better alignment among ICT investments, improvements of the logistics process, and the value proposition of the firm can lead to superior economic performance. A survey was conducted and 180 usable questionnaires were collected from companies engaged in the electronics and vehicle manufacturing industries. Data were analysed through the structural equation modeling approach. The results show that improvements in speed and dependability, achieved through technology adoption, can lead to better economic performance if they are embedded into superior after-sales services and/or into an improved product offering.

13.
Empirical studies using survey data on expectations have frequently observed that forecasts are biased and have concluded that agents are not rational. We establish that existing rationality tests are not robust to even small deviations from symmetric loss and hence have little ability to tell whether the forecaster is irrational or the loss function is asymmetric. We quantify the trade-off between forecast inefficiency and asymmetric loss leading to identical outcomes of standard rationality tests and explore new and more general methods for testing forecast rationality jointly with flexible families of loss functions that embed squared loss as a special case. Empirical applications to survey data on forecasts of real output growth and inflation suggest that rejections of rationality may largely have been driven by the assumption of squared loss. Moreover, our results suggest that agents are averse to "bad" outcomes such as lower-than-expected real output growth and higher-than-expected inflation and that they incorporate such loss aversion into their forecasts. (JEL: C22, C53, E37)
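The point about asymmetric loss can be illustrated with a small simulation (my own sketch with made-up parameters, not the paper's estimator or data): under "lin-lin" loss the optimal forecast is a quantile of the predictive distribution rather than its mean, so a standard zero-mean-error test rejects "rationality" for a forecaster who is in fact optimizing its loss function.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
T, sigma, alpha = 500, 1.0, 0.7            # alpha > 0.5: under-prediction is costlier, so aim high
mu = rng.normal(2.0, 0.5, size=T)          # time-varying conditional mean of the target
y = mu + rng.normal(0.0, sigma, size=T)    # realized outcomes

# Under lin-lin loss the optimal forecast is the conditional alpha-quantile, not the mean.
forecast = mu + sigma * stats.norm.ppf(alpha)

errors = y - forecast
res = stats.ttest_1samp(errors, 0.0)       # standard "zero mean forecast error" rationality test
print(f"mean error = {errors.mean():.3f}, t = {res.statistic:.2f}, p = {res.pvalue:.4f}")
# The test rejects unbiasedness even though the forecaster is optimal for its asymmetric
# loss: bias alone cannot separate irrationality from asymmetric preferences.
```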

14.
Information systems researchers have often turned to a variant of the Delphi survey technique to support their research of key issues in their field. Two particular weaknesses of past studies using this approach have been the lack of a definitive method for conducting the research and the lack of statistical support for the conclusions drawn by the researchers. In this paper, the author presents a method, based on nonparametric statistical techniques, to conduct ranking-type Delphi surveys, perform analysis, and report results. The author takes this a step further by illustrating an actual analysis of a Delphi survey; the analysis is compared to results that were presented without the benefit of the author's approach. This paper shows that use of the advocated approach can streamline and strengthen studies, improve the validity of results, and thus better serve the consumers of the research findings. Since the ranking-type Delphi is so popular among information systems researchers, a consistent method is needed for data collection, analysis, and reporting of results. This paper provides such a method in concise form and illustrates its use in a manner that allows comparison between it and previous practice.
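One nonparametric statistic commonly used to support ranking-type Delphi studies is Kendall's coefficient of concordance W, which summarizes how strongly the panel agrees on a ranking. The sketch below (hypothetical panel data; my illustration rather than the author's exact procedure) computes W for a small ranking matrix without ties.

```python
import numpy as np

def kendalls_w(ranks):
    """ranks: (n_raters, n_items) array of rankings, 1 = most important, no ties."""
    ranks = np.asarray(ranks, dtype=float)
    m, n = ranks.shape                          # m raters, n items
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))   # W = 12S / (m^2 (n^3 - n))

# Hypothetical round-two rankings of five "key issues" by four panelists.
panel = [[1, 2, 3, 4, 5],
         [2, 1, 3, 5, 4],
         [1, 3, 2, 4, 5],
         [2, 1, 4, 3, 5]]
print(f"W = {kendalls_w(panel):.2f}")           # values near 1 indicate strong consensus
```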

15.
16.
Approximation mechanism design without money was first studied by Procaccia and Tennenholtz (2009) in the context of a facility location game. In general, a facility is to be opened, and the cost of an agent is measured by its distance to the facility. In order to achieve a good social cost, a mechanism selects the location of the facility based on the locations reported by agents. This motivates agents to report their locations strategically in order to obtain better outcomes for themselves. A mechanism is called strategyproof if no agent can obtain a better outcome by misreporting, regardless of the reports of the other agents. The main contribution of this paper is to explore strategyproof mechanisms without money when agents are distinguishable. There are two main variations on the nature of agents: in one, agents prefer to be close to the facility, while in the other, agents prefer to be far away from it. We first consider a model that directly extends the one in Procaccia and Tennenholtz (2009), examining strategyproof mechanisms without money when agents are weighted. We show that the strategyproof mechanisms for unweighted agents remain the best in the weighted case. We establish tight lower and upper bounds on the approximation ratios for the optimal social utility and the minimum utility when agents prefer to stay close to the facility. We then provide lower and upper bounds on the optimal social utility and a lower bound on the minimum distance per weight when agents prefer to stay far away from the facility. We also extend our study in a natural direction where two facilities must be built on a real line. Second, we propose a novel threshold-based model to distinguish agents, for which we present a strategyproof mechanism that leads to optimal solutions in terms of social cost.
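For the baseline model of Procaccia and Tennenholtz (2009), where unweighted agents on a line prefer to be close to the facility, the canonical strategyproof mechanism without money places the facility at the median of the reported locations. The short sketch below (my own numeric example) shows that a misreport never helps the deviating agent.

```python
from statistics import median

def median_mechanism(reports):
    """Open the facility at the median of the reported locations on the real line."""
    return median(reports)

def cost(agent_location, facility):
    """An agent's cost is its distance to the facility."""
    return abs(agent_location - facility)

true_locations = [0.0, 2.0, 3.0, 7.0, 10.0]
facility = median_mechanism(true_locations)
print("truthful reports -> facility at", facility,
      "| cost of agent at 7.0 =", cost(7.0, facility))

# Misreports by the agent truly located at 7.0 either leave the median unchanged
# or push it further away, so lying never reduces that agent's cost below 4.0.
for lie in (0.0, 5.0, 100.0):
    reports = [0.0, 2.0, 3.0, lie, 10.0]
    f = median_mechanism(reports)
    print(f"report {lie:>5}: facility {f}, cost for true location 7.0 = {cost(7.0, f)}")
```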

17.
Multinational companies use a wide range of mechanisms to keep control over a subsidiary abroad, such as the share of capital in the case of international joint ventures, expatriation, active participation in the board of directors, staffing of key management positions, training and socialization of employees, technology transfer, and so on. However, only a few empirical studies on the control of international subsidiaries embrace all these dimensions simultaneously and show how they interact. This paper presents the empirical results of a quantitative survey of 316 subsidiaries, both international joint ventures and wholly foreign-owned enterprises, set up in China by European and Japanese multinationals. The main objective of the survey is to derive an inductive multidimensional model of control and to allow a better understanding of the complex interaction and balance among the instruments of control of a subsidiary abroad.

18.
Many queueing systems are subject to time-dependent changes in system parameters, such as the arrival rate or the number of servers. Examples include time-dependent call volumes and agent staffing at inbound call centers, time-varying air traffic at airports, time-dependent truck arrival rates at seaports, and cyclic message volumes in computer systems. There are several approaches for the performance analysis of queueing systems with deterministic parameter changes over time. In this survey, we develop a classification scheme that groups these approaches according to their underlying key ideas into (i) numerical and analytical solutions, (ii) approaches based on models with piecewise constant parameters, and (iii) approaches based on modified system characteristics. Additionally, we identify links between the different approaches and provide a survey of applications, categorized into service, road and air traffic, and IT systems.
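To give a flavor of the "piecewise constant parameters" family of approaches, the sketch below (an illustrative arrival-rate profile I assume for the example, not one taken from the survey) applies a pointwise stationary approximation: the stationary Erlang-C delay probability of an M/M/c queue is evaluated with the arrival rate frozen at each hour of a sinusoidal daily call-volume pattern.

```python
import math

def erlang_c(lam, mu, c):
    """Stationary probability of delay in an M/M/c queue (requires lam < c * mu)."""
    a = lam / mu                                   # offered load
    rho = a / c
    summation = sum(a ** k / math.factorial(k) for k in range(c))
    top = a ** c / (math.factorial(c) * (1.0 - rho))
    return top / (summation + top)

mu, c = 1.0, 12                                    # service rate per agent, 12 agents on duty
for hour in range(0, 24, 3):
    # Sinusoidal daily arrival-rate profile (assumed for illustration only).
    lam_t = 8.0 + 3.0 * math.sin(2 * math.pi * hour / 24)
    print(f"hour {hour:2d}: lambda = {lam_t:5.2f}, P(wait) = {erlang_c(lam_t, mu, c):.3f}")
```

The approximation treats each instant as if the queue were in steady state with the current parameters; it is simple but, as the surveyed literature notes, can be inaccurate when the parameters change quickly relative to the service times.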

19.
Consequence models for the risk assessment of man-made or natural disasters do not ordinarily take into account time-of-day variations in the size of the exposed population. Residential census population statistics are used instead. This paper proposes and illustrates a methodology for using metropolitan travel survey data to estimate the variations in question. Variations are computed from the Washington, D.C. area sample survey statistics on the number of trips taken in and out of different census tracts throughout each workday. Four principal patterns of population variation are identified, corresponding to four types of land use: commercial, residential, shopping/entertainment, and mixed use. Some general implications for consequence analysis are discussed.
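The basic accounting behind the proposed methodology can be sketched in a few lines (entirely hypothetical trip counts, not the Washington, D.C. survey data): starting from a tract's residential census population, the estimate for each hour adds the trips ending in the tract and subtracts the trips leaving it.

```python
census_population = 4_000                                  # nighttime (residential) population
hours = list(range(6, 19))                                 # 06:00 .. 18:00
trips_in  = [300, 900, 1200, 600, 300, 200, 250, 300, 250, 200, 300, 200, 150]
trips_out = [150, 250,  300, 250, 200, 250, 300, 250, 600, 900, 1200, 600, 300]

population = census_population
for hour, inflow, outflow in zip(hours, trips_in, trips_out):
    population += inflow - outflow                         # cumulative surplus of arrivals
    print(f"{hour:02d}:00  estimated population = {population}")
# A commercial tract shows a daytime peak like this one; a residential tract shows
# roughly the mirror image, with the daytime population dipping below the census figure.
```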

20.
Though use of the controversial precautionary principle in risk management has increasingly been recommended as a guide for the construction of public policy in Canada and elsewhere, there are few data available characterizing its use in risk management by senior public policymakers. Using established survey methodology, we sought to investigate the perceptions and terms of application of the precautionary principle in this important subset of individuals. A total of 240 surveys were sent out to seven departments or agencies in the Canadian government. The overall survey response rate was 26.6%, and our findings need to be interpreted in the context of possible responder bias. The overwhelming majority of respondents perceived the precautionary principle and the management of risk as complementary, and endorsed a role for the precautionary principle as a general guideline for all risk management decisions. However, 25% of respondents reported that the lack of clarity in the principle's definition was a limitation to its effective use. The majority of respondents viewed their own level of understanding of the precautionary principle as moderate. Risk managers appeared to favor an interpretation of the precautionary principle based on the seriousness and irreversibility of the threat of damage, and did not endorse as strongly the need for cost-effectiveness in the measures taken as a precaution against such threats. In contrast with its perceived role as a general guideline, the application of the precautionary principle by respondents was highly variable, with more than 60% of respondents reporting that they used the precautionary principle in one-quarter or less of all risk management decisions. Several factors influenced whether the precautionary principle was applied, with the perceived seriousness of the threat considered the most influential. The overwhelming majority of risk managers felt that "preponderance of evidence" was the level of evidence required for precautionary action to be instituted against a serious negative event. Overall, the majority of respondents viewed the precautionary principle as having a significant and positive impact on risk management decisions. Importantly, respondents endorsed a net result of more good than harm to society when the precautionary principle was applied to the management of risk.
