Similar Literature
20 similar records found (search time: 874 ms)
1.
Epileptic seizures are manifestations of intermittent spatiotemporal transitions of the human brain from chaos to order. Measures of chaos, namely maximum Lyapunov exponents (STLmax), from dynamical analysis of the electroencephalograms (EEGs) at critical sites of the epileptic brain progressively converge (diverge) before (after) epileptic seizures, a phenomenon that has been called dynamical synchronization (desynchronization). This dynamical synchronization/desynchronization has already formed the basis for the design and development of systems for long-term (tens of minutes), on-line, prospective prediction of epileptic seizures. In addition, a criterion based on changes in the time constants of the observed synchronization/desynchronization at seizure points has been used to show resetting of the epileptic brain in patients with temporal lobe epilepsy (TLE), a phenomenon that implicates a possible homeostatic role for the seizures themselves in restoring normal brain activity. In this paper, we introduce a new criterion for measuring this resetting that utilizes changes in the level of the observed synchronization/desynchronization. We compare the sensitivity of this criterion with that of the earlier one based on the time constants of the observed synchronization/desynchronization. Next, we test the robustness of the resetting phenomena with respect to the utilized measures of EEG dynamics in a comparative study involving STLmax, a measure of phase (φmax) and a measure of energy (E), using both criteria (i.e. the level and the time constants of the observed synchronization/desynchronization). The measures are estimated from intracranial electroencephalographic (iEEG) recordings with subdural and depth electrodes in two patients with focal temporal lobe epilepsy and a total of 43 seizures. Techniques from optimization theory, in particular quadratic bivalent programming, are applied to optimize the performance of the three measures in detecting preictal entrainment.
It is shown that, using either of the two resetting criteria and any of the three dynamical measures, dynamical resetting at seizures occurs with a significantly higher probability (α=0.05) than resetting at randomly selected non-seizure points in days of EEG recordings per patient. It is also shown that dynamical resetting at seizures detected using the time constants of STLmax synchronization/desynchronization occurs with a higher probability than with the other synchronization measures, whereas dynamical resetting at seizures using the level-of-synchronization/desynchronization criterion is detected with similar probability by any of the three measures. These findings show the robustness of seizure resetting with respect to the measures of EEG dynamics and the resetting criteria utilized, and the critical role it might play in further elucidating ictogenesis, as well as in the development of novel treatments for epilepsy.
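The preictal entrainment described above is typically quantified with a T-index, a paired t-statistic of the differences between the STLmax profiles of an electrode pair over a sliding window. The sketch below is a minimal illustration of that statistic; the window length and the exact normalization are assumptions for illustration, not the authors' implementation.

```python
import math

def t_index(stl_a, stl_b):
    """Paired t-statistic of the differences between two STLmax
    profiles over one window; values near zero indicate entrainment
    (convergence) of the two electrode sites."""
    d = [a - b for a, b in zip(stl_a, stl_b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)
    if var == 0:
        # degenerate window: identical profiles are fully entrained
        return 0.0 if abs(mean) < 1e-12 else float("inf")
    return abs(mean) / math.sqrt(var / n)

def sliding_t_index(stl_a, stl_b, window=60):
    """T-index profile over consecutive non-overlapping windows."""
    return [t_index(stl_a[i:i + window], stl_b[i:i + window])
            for i in range(0, len(stl_a) - window + 1, window)]
```

Preictal entrainment would then show up as the T-index of a critical electrode pair dropping below a fixed threshold and staying low for tens of minutes before the seizure.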

2.
In an effort to understand the basic functional mechanisms that can produce epileptic seizures, we introduce key features into a model of coupled neural populations that enable the generation of seizure-like events and of dynamics similar to those observed along the epileptic brain's route towards real seizures. In this model, modified from David and Friston's neural mass model, an internal feedback mechanism is incorporated to maintain synchronous behavior within normal levels despite elevated coupling. Normal internal feedback quickly regulates an abnormally high coupling between the neural populations, whereas pathological internal feedback can lead to hypersynchronization and the appearance of seizure-like high-amplitude oscillations. Feedback decoupling is introduced as a robust seizure control strategy: an external feedback decoupling controller is added to maintain normal synchronous behavior. The results from the analysis of this model have an interesting physical interpretation and specific implications for the treatment of epileptic seizures. The proposed model and control scheme are consistent with a variety of recent observations in the human and animal epileptic brain, and with theories from nonlinear systems, adaptive systems, optimization, and neurophysiology.
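A minimal illustration of the feedback-decoupling idea, using two phase oscillators in place of full neural mass models (the oscillator equations, frequencies, and gain below are illustrative assumptions, not the paper's model): strong coupling locks the two phases together (hypersynchronization), while a controller that subtracts an estimate of the coupling input releases them.

```python
import math

def simulate(coupling, control_gain, steps=20000, dt=0.001):
    """Euler integration of two coupled phase oscillators.

    Each phase obeys d(theta)/dt = w + K*sin(other - self); the external
    controller subtracts control_gain times its estimate of the coupling
    input, so control_gain = 1 fully decouples the two populations."""
    w1, w2 = 10.0, 13.0              # distinct natural frequencies (rad/s)
    th1, th2 = 0.0, 1.0              # initial phases
    for _ in range(steps):
        c12 = coupling * math.sin(th2 - th1)   # drive on oscillator 1
        c21 = coupling * math.sin(th1 - th2)   # drive on oscillator 2
        th1 += dt * (w1 + (1.0 - control_gain) * c12)
        th2 += dt * (w2 + (1.0 - control_gain) * c21)
    return th2 - th1                 # final phase difference

# Pathological coupling (K = 5) locks the phases near a constant offset;
# the decoupling controller (gain = 1) lets them drift apart again.
locked = simulate(5.0, 0.0)
released = simulate(5.0, 1.0)
```

With K = 5 and a 3 rad/s frequency mismatch, the locked pair settles near asin(3/10) ≈ 0.305 rad, while full decoupling accumulates 1 + 3·20 = 61 rad of drift over the 20 s run.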

3.
Epilepsy is a brain disorder characterized clinically by temporary but recurrent disturbances of brain function that may or may not be associated with impairment or loss of consciousness and abnormal behavior. The human brain is composed of more than 10^10 neurons, each of which receives electrical impulses, known as action potentials, from other neurons via synapses and sends electrical impulses via a single output line (the axon) to a similar number of neurons. When neuronal networks are active, they produce a change in voltage potential, which can be captured by an electroencephalogram (EEG). The EEG recordings are time series that correspond to neurological activity as a function of time. By analyzing the EEG recordings, we sought to evaluate the degree of underlying dynamical complexity prior to seizure onset. Through dynamical measurements, it is possible to classify the state of the brain according to the underlying dynamical properties of the EEG recordings. In results from two patients with temporal lobe epilepsy (TLE), the degree of complexity was observed to converge to lower values prior to epileptic seizures, in epileptic as well as non-epileptic regions. The dynamical measurements appear to reflect changes in the EEG's dynamical structure. We suggest that nonlinear dynamical analysis can provide useful information for detecting relative changes in brain dynamics that cannot be detected by conventional linear analysis.
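The abstract does not name its specific complexity measure; as one hedged illustration of a nonlinear complexity statistic that decreases as a signal becomes more ordered, here is a compact O(n²) sample-entropy sketch (the parameters m and r are conventional defaults, not values from the study):

```python
import math

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy SampEn(m, r) of a 1-D signal, with the tolerance
    r set to r_frac times the signal's standard deviation.
    Lower values indicate a more ordered (less complex) signal."""
    n = len(x)
    mean = sum(x) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    r = r_frac * sd

    def count_matches(mm):
        # number of template pairs of length mm within tolerance r
        templates = [x[i:i + mm] for i in range(n - mm + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in
                       zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits

    b, a = count_matches(m), count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")
```

A highly regular signal (e.g. a sine wave) scores much lower than pseudo-random noise, mirroring the drop in complexity reported before seizure onset.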

4.
Fundamental problems in data mining mainly involve discrete decisions based on numerical analyses of data (e.g., class assignment, feature selection, data categorization, identifying outlier samples). These decision-making problems are combinatorial in nature and can naturally be formulated as discrete optimization problems. One of the most widely studied problems in data mining is clustering. In this paper, we propose a new optimization model for hierarchical clustering based on quadratic programming and show that this model is compact and scalable. The application of this clustering technique to epilepsy, the second most common brain disorder, is a case in point in this study. In our empirical study, we apply the proposed clustering technique to treatment problems in epilepsy through brain dynamics analysis of electroencephalogram (EEG) recordings. This study is a proof of concept of our hypothesis that epileptic brains tend to be more synchronized (clustered) during the period before a seizure than during a normal period. The results of this study suggest that data mining research might be able to revolutionize the current diagnosis and treatment of epilepsy, as well as give a greater understanding of brain functions (and other complex systems) from a systems perspective. This work was partially supported by NSF grant CCF 0546574 and Rutgers Research Council grant-202018.
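The quadratic-programming formulation itself is not reproduced in this abstract; as an illustration of the hierarchical clustering step it optimizes, here is a plain single-linkage agglomerative sketch over a made-up matrix of pairwise distances (e.g. one minus a channel-synchronization score between EEG sites):

```python
def agglomerate(dist, k):
    """Single-linkage agglomerative clustering down to k clusters.

    dist is a symmetric matrix of pairwise distances between items;
    at each step the two clusters with the smallest minimum inter-item
    distance are merged."""
    clusters = [[i] for i in range(len(dist))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(dist[i][j]
                        for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)   # b > a, so index a is stable
    return [sorted(c) for c in clusters]
```

Under the paper's hypothesis, preictal EEG channels would collapse into fewer, tighter clusters than the same channels during a normal period.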

5.
The management of natural hazards occurring over a territory entails two main phases: a preoperational (pre-event) phase, whose objective is to relocate resources closer to the sites characterized by the highest hazard, and an operational (during-event) phase, whose objective is to manage the available resources in real time by allocating them to the sites where intervention is needed. The two phases are closely related and demand a unified, integrated treatment. This work presents a unifying framework that integrates the various decision problems arising in the management of different kinds of natural hazards. The proposed approach, which is based on a mathematical programming formulation, can support decision makers in optimal resource allocation before (preoperational phase) and during (operational phase) an emergency due to natural hazard events. Different ways of modeling the resources and the territory are proposed and discussed according to their appropriateness to the preoperational and operational phases. The approach can be applied to the management of any natural hazard and, from an integration perspective, may be particularly useful for risk management in civil protection operations. An application to the management of wildfire hazard is presented.
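As a toy illustration of the preoperational phase (not the paper's formulation), the sketch below pre-positions k resource depots to minimize the hazard-weighted distance from each site to its nearest stocked depot, a tiny p-median instance solved by enumeration; real instances would be handled by the mathematical programming model:

```python
from itertools import combinations

def preposition(hazard, dist, k):
    """Choose k depot sites minimizing the hazard-weighted distance
    from every site to its nearest stocked depot (brute-force p-median;
    hazard[s] is the hazard level at site s, dist a distance matrix)."""
    sites = range(len(hazard))
    best, best_cost = None, float("inf")
    for depots in combinations(sites, k):
        cost = sum(h * min(dist[s][d] for d in depots)
                   for s, h in zip(sites, hazard))
        if cost < best_cost:
            best, best_cost = depots, cost
    return best, best_cost
```

In the operational phase the same structure reappears with real-time demands in place of hazard estimates, which is why the two phases lend themselves to a single integrated formulation.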

6.
7.
We extend Implicit Leadership Theory, which addresses the criteria that individuals use to identify leaders, by examining whether the predictors of leadership emergence change over time. Building on leader-distance research, we predict that time influences the traits on which individuals base their selection of others as leaders: initially, before individuals have had many opportunities to interact and the distance between them is high, they select leaders according to easily noticeable physical and psychological traits; with time, as distance decreases, they rely on more covert psychological traits. We carried out a three-day field study in an intensive workshop for individuals entering an executive-MBA program (n = 64), gathering data from participants at four points in time. We found that the criteria by which people nominate leaders change over time from easily noticeable traits (facial attractiveness, gender, extraversion) to more covert personality traits (conscientiousness).

8.
In the United States, insurance against flood hazard (inland flooding or storm surge from hurricanes) has been provided mainly through the National Flood Insurance Program (NFIP) since 1968. The NFIP covers $1.23 trillion of assets today. This article provides the first analysis of flood insurance tenure ever undertaken: that is, the number of years that people keep their flood insurance policy before letting it lapse. Our analysis of the entire portfolio of the NFIP over the period 2001-2009 reveals that the median tenure of new policies during that time is between two and four years; it is also relatively stable over time and across levels of flood hazard. Prior flood experience can affect tenure: people who have experienced small flood claims tend to hold onto their insurance longer, while people who have experienced large flood claims tend to let their insurance lapse sooner. To overcome the policy and governance challenges posed by homeowners' inadequate insurance coverage, we discuss policy recommendations, including stronger requirements from banks and government-sponsored enterprises (GSEs) and the introduction of multiyear flood insurance contracts attached to the property, both of which are likely to provide more coverage stability and encourage investments in risk-reduction measures.

9.
Using the Ohlson model, this paper examines the pricing of loss-making A-share companies over 2002-2007. To address the anomaly that the earnings of loss-making companies are negatively related to equity value, we introduce government subsidies, growth, R&D expenditure, and debt financing information into the pricing model, and find that the subsidy, growth, and debt factors effectively improve the model, eliminating the significant negative relation between earnings and equity value and improving the model's performance. Further classifying loss-making companies along the dimensions of size and net income, we find that, along the net-income dimension, the government provides larger subsidies to companies that can more easily reverse losses through non-recurring items; government subsidies are significantly negatively related to the equity value of companies with positive net income, and show no significant relation for companies with negative net income. Along the size dimension, these patterns are more pronounced in the small-company sample.

10.
In this study, we consider a supplier's contract offerings to a buyer who may obtain improved forecasts for her demand over time. We investigate how the supplier can take advantage of the buyer's better forecasts and what kind of contracts he should offer the buyer in order to maximize his profits. We model a natural forecast evolution in which the buyer can obtain a more accurate forecast closer to the selling season. We assume there is information asymmetry between the buyer and the supplier at all times, in that the buyer understands her demand better than the supplier. Three types of contracts that the supplier can offer are considered: (1) a contract offered before the buyer has a chance to obtain improved forecasts, (2) a contract offered after the buyer has obtained improved forecasts, and (3) a contingent (dynamic) contract that offers an initial contract to the buyer before she obtains improved forecasts, followed by a later contract (contingent on the initial contract) offered after improved forecasts have been obtained. We consider two scenarios: (1) the supplier is certain that the buyer can obtain more accurate forecasts over time, and (2) the supplier is uncertain about the buyer's forecasting capability (or forecasting cost). In the first scenario, we show that among the three types of contracts, the contingent contract is always the most profitable for the supplier. Furthermore, using the contingent contract, the supplier always benefits from higher accuracy of the buyer's demand forecasts. In the second scenario, we explicitly model the supplier's level of certainty about the buyer's capability of obtaining better forecasts, and explore how the supplier can design contracts to induce the buyer to obtain better forecasts when she is capable.

11.
Because of the changing competitive environment, quality might have lost some of its luster and emphasis in business. The research question we aim to address in this paper is: does quality still pay in the new competitive environment? Using replication research, we re-examine the impact of an effective total quality management (TQM) program on a firm's operating performance in the new competitive environment. We use publicly available data for award-winning firms and adopt several control-firm-selection approaches in our event study. Based on data from more than 500 firms, we find that over a 10-year period, from 6 years before to 3 years after winning their first quality award, firms in our sample perform significantly better than control groups on various operating performance measures. Not only do award-winning firms have better results after receiving awards, they also have superior performance records before the award. Our results suggest that quality is still critical to achieving long-term competitive advantages, and that firms that continuously improve their quality continue to reap rewards through sales and financial performance exceeding that of their competitors.

12.
Customer integration has become an established topic in management research, with specific attention from marketing researchers. The term refers to a new definition of the customer's role in market exchange: while the customer was previously considered a merely passive recipient of goods and services, he now takes over the function of an active supplier of input before, during, and after a market transaction. Currently, four different research lines on the topic can be identified, each of which also mirrors a distinct manifestation of practical application: (1) the business of solutions, (2) mass customization, (3) service co-creation and (4) value co-creation. For the future we expect further stimulation of both the practical and the conceptual advancement of customer integration, particularly from the new technologies summarized under the label of Web 2.0. For Web 2.0 we distinguish between moderator-centered approaches, such as crowdsourcing, swarm creativity and open innovation, on the one hand, and community-centered approaches, such as social network sites, on the other. In order to advance and substantiate our understanding of the new implications of Web 2.0 for customer integration, we propose to draw on the theory of social capital and the concept of the borderless organization.

13.
Bruno Decreuse, LABOUR, 2002, 16(4): 609-633
Should we cut the level of unemployment benefits, or reduce their potential duration? The answer depends on how unemployed workers' search behaviour and unemployment insurance schemes interact. In this paper, we consider that unemployment insurance funds can be used to improve search. The resulting hazards are increasing over the unemployment spell prior to the exhaustion of benefits, and plummet immediately after it. Turning to policy implications, we assume the public decision-maker aims to minimize the average duration of unemployment under a resource constraint. First, we show that the stationary relationship between average unemployment duration and the unemployment benefit is hump-shaped. Second, raising benefits over a short duration can reduce average duration. Finally, we demonstrate that most of the time a declining (yet always positive) benefit scheme is optimal.

14.
Using sample survey data, this paper compares township finances before and after the tax-for-fee reform in terms of revenue structure (budgetary and extra-budgetary), expenditure structure (budgetary and extra-budgetary), upward remittances and subsidies, and the fiscal system. The study shows that after the reform, although both remittances and subsidies between counties and townships increased, county control over township finances was on the whole stronger than before the reform, and townships' control over their own finances grew steadily weaker. The study also shows that, compared with 2000 (before the reform), the fiscal condition of townships had deteriorated further by 2004. In addition, the reform affected provinces and regions differently: relative to poor areas, the adverse impact on wealthy areas was more severe.

15.
For years we have been hearing that US automobile manufacturers have been losing market share to their Japanese rivals, who are reputed to make better-quality vehicles. Most such reports are based on initial quality surveys of new automobiles. In this paper we address two exploratory questions: (1) how does the quality of an automobile change with its age, and (2) can firm-level variables help explain differences in quality? To answer these questions, we collected Consumer Reports' reliability ratings on approximately 300 automobile models made by European, Japanese and US automotive firms during 1998-2007, and on approximately 240 models made by these firms over 2008-2015. For both periods we found that not only do automobiles made by Japanese firms have higher initial quality but, as automobiles get older, the difference in product quality between Japanese versus European and US firms increases. We also found that the more generalist a European or US automobile firm, i.e., the wider the firm's product offering in the marketplace, the lower its overall automobile quality during the 1998-2007 period. Conversely, Japanese generalist firms were found to have higher quality than specialist firms over the same period. This result is partly explained by the fact that Japanese firms have taken a different path to broadening their product variety: they have ensured a high level of quality in their initial offerings before entering newer market segments. The rate of reliability decline was found to be slower for all firms, and the differences in reliability across the three groups of firms were much less pronounced, during the 2008-2015 period. This improvement may be a result of the restructuring done by US automobile firms.

16.
Drawing on the LLSV tunneling model, this paper derives a model of controlling shareholders' concealed tunneling of listed companies and establishes a piecewise functional relationship between the controlling shareholder's ownership stake and the degree of tunneling. Using a sample of 386 events in 2004 in which Chinese A-share listed companies (Shanghai and Shenzhen) provided loan guarantees for their subsidiaries, divided into an over-guarantee group and a moderate-guarantee group, we conduct an event study. We find that the cumulative abnormal returns of the over-guarantee group are significantly negative, while the market reaction of the moderate-guarantee group is positive but insignificant, suggesting that over-guaranteeing listed companies tend to channel benefits to their controlling shareholders. Multiple regressions show that 60% is an effective critical point for the coefficients on both the largest shareholder's stake and state ownership: when the largest shareholder holds less than 60%, large shareholders tend to tunnel the listed company through guarantees to subsidiaries, though the coefficient is insignificant; when the stake exceeds 60%, a significant interest-alignment effect emerges that effectively restrains tunneling. Regressions combining categorical and ownership-threshold variables further show that state-controlled listed companies are tunneled to a greater degree, and that, compared with the largest shareholder's stake, the ownership threshold of the state-ownership variable has a more significant effect on CAR.

17.
This paper seeks to provide empirical evidence on the efficacy of three important governance mechanisms (auditors, directors, and institutional shareholders) in constraining aggressive financial reporting, proxied by abnormal accruals. It also examines the effects of the Sarbanes-Oxley Act (SOX) on their efficacy. Using a sample of US firms audited by the Big 5 (4) auditors between 2000 and 2004, we document a positive relation between abnormal accruals (our proxy for financial reporting aggressiveness) and auditors' economic dependence on their clients. Furthermore, we find that this relation is driven by firms with weak non-auditor governance mechanisms, both before and after the enactment of SOX. The results suggest that aggressive financial reporting occurs only when multiple governance mechanisms 'fail'; more specifically, such reporting requires that a highly dependent auditor operate in a 'poor' governance setting. The paper thus underscores the importance of strong governance in constraining aggressive financial reporting. Moreover, our results suggest that governance regulation (such as SOX) is not a substitute for strong governance mechanisms, and thus caution against over-reliance on SOX-type legislation in other parts of the world.

18.
Leaders, followers, and time   (total citations: 5; self-citations: 4; citations by others: 1)
In order to consider leadership from a temporal perspective, we examine extant leadership research that refers to temporal variables in its theorizing and/or empirical testing. We consider rhythmic patterns manifested in leader and follower behavior and employ entrainment, polychronicity, pace/speed, punctuality, and temporal depth as categorization concepts for the analysis. Further, we propose general theoretical statements about temporal dimensions and their prospective roles in relationships and processes related to leadership.

19.
In this work, we evaluate eight exchange traded funds (ETFs) and their benchmark index (the KOSPI 200 Index) based on the Sharpe ratio and the Treynor ratio, and find that the performance of these well-diversified portfolios is quite poor relative to individual stocks. Investors' tendency to avoid well-diversified portfolios may be related to this poor performance. However, we empirically show that the ETFs and the KOSPI 200 Index are the most efficient investment instruments with respect to a new performance measure designed on the basis of the data envelopment analysis (DEA) methodology. Examining panel data over the period 2003-2014 indicates that well-diversified portfolios improve efficiency by adjusting the input variables (σ and β), and that they do so more effectively as they mature.
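The two classical ratios used above can be sketched directly: the Sharpe ratio scales excess return by total risk (standard deviation), the Treynor ratio by systematic risk (beta against the benchmark). The toy return series in the usage note is invented, and a flat risk-free rate of zero is assumed.

```python
import math

def sharpe_treynor(returns, market, risk_free=0.0):
    """Sharpe ratio (excess return per unit of total risk) and Treynor
    ratio (excess return per unit of beta) for one series of periodic
    portfolio returns, measured against benchmark (market) returns."""
    n = len(returns)
    mu = sum(returns) / n
    mu_m = sum(market) / n
    var = sum((r - mu) ** 2 for r in returns) / n
    cov = sum((r - mu) * (m - mu_m)
              for r, m in zip(returns, market)) / n
    var_m = sum((m - mu_m) ** 2 for m in market) / n
    beta = cov / var_m
    sharpe = (mu - risk_free) / math.sqrt(var)
    treynor = (mu - risk_free) / beta
    return sharpe, treynor
```

For a portfolio that tracks the benchmark one-for-one plus a 1% mean excess return, beta is 1, so the Treynor ratio equals that 1% mean excess return.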

20.
In Britain, it is recommended that, to stay healthy, adults should do 150 minutes of moderate-intensity physical activity every week. The recommendations provided by the U.K. government, however, remain silent on the type of activity that should be done. Using the annual Health Survey for England, we compare how different types of physical activity predict a person's weight, considering in particular clinically measured body mass index and waist circumference. We document mean slopes from ordinary least squares regressions with these measures as the dependent variables, and show that individuals who walk at a brisk or fast pace are more likely to have a lower weight than individuals doing other activities. Additionally, we highlight that the association between physical activity and weight is stronger for females and for individuals over the age of 50. Our overall conclusions are robust to a number of specifications.
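The "mean slopes" referred to above are ordinary least squares coefficients; for a single regressor the slope is cov(x, y)/var(x), as in this sketch (the toy activity/BMI numbers in the test are invented for illustration):

```python
def ols_slope(x, y):
    """OLS slope of y on a single regressor x: cov(x, y) / var(x).

    With a 0/1 activity indicator as x (e.g. brisk walker vs. not),
    the slope equals the difference in mean outcome between groups."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var
```

A negative slope on the brisk-walking indicator, with BMI as the dependent variable, is the pattern the article reports.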


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号