Similar Documents
20 similar documents found.
1.
Kramer and Lee recently addressed a common due window scheduling problem with earliness and tardiness penalties, where earliness and tardiness penalty factors are constant and the common window size is given. They showed that the problem is polynomial when the location of the due window is a decision variable. For the case where the location of the due window is given, the problem is also polynomial when the latest due date is greater than or equal to the makespan, and they proposed a pseudopolynomial dynamic programming algorithm to find an optimal schedule when the latest due date is less than the makespan. In this note we address the problem for the case where the location of the due window is given. Specifically, we show that the problem is polynomial if the window location is unrestricted, and present a more efficient dynamic programming algorithm to optimally solve the problem if the window location is restricted. The concepts of unrestricted and restricted window locations are defined in this note.

2.
We show in an environment of incomplete information that monotonicity and the Pareto property applied only when there is common knowledge of Pareto dominance imply (i) there must exist a common prior over the smallest common knowledge event, and (ii) aggregation must be ex ante and ex post utilitarian with respect to that common prior and individual von Neumann–Morgenstern utility indices.

3.
Price System and Supply-Demand Analysis of China's Lottery Market   (total citations: 1; self-citations: 0; citations by others: 1)
To analyze the supply and demand functions for lottery tickets, a price system for the lottery market is established, comprising the ticket's sale price, expected value, and effective price. On this basis, with effective price as the price variable, the supply and demand functions of lotteries are studied. Applying the statistical principle of expected value, the supply function is obtained by computing the ticket's expected value; the resulting supply curve first falls and then rises. The demand function, estimated by regression analysis, is decreasing. Finally, an empirical study of China's Double Color Ball lotto game estimates its supply and demand functions, conducts equilibrium and elasticity analyses of the lottery market, and, based on the results, offers policy recommendations for developing China's lottery industry.
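As a rough illustration of the abstract's price system, the sketch below computes a ticket's expected value from a fixed-prize table and derives the effective price as the face price minus that expected value. The game structure loosely follows Double Color Ball (pick 6 of 33 red balls plus 1 of 16 blue), but the prize tiers and amounts are simplified, hypothetical stand-ins, not the game's actual rules:

```python
from fractions import Fraction
from math import comb

def expected_value(prizes):
    """Expected payout of one ticket: sum over tiers of prize * probability."""
    red_total = comb(33, 6)                  # ways to draw the 6 red balls
    ev = Fraction(0)
    for matched_red, matched_blue, prize in prizes:
        p_red = Fraction(comb(6, matched_red) * comb(27, 6 - matched_red),
                         red_total)
        p_blue = Fraction(1, 16) if matched_blue else Fraction(15, 16)
        ev += prize * p_red * p_blue
    return ev

# (matched red, matched blue, prize in yuan): illustrative fixed amounts only;
# the real top tiers are pari-mutuel and are crudely approximated here.
tiers = [(6, 0, 3000), (5, 1, 3000), (5, 0, 200), (4, 1, 200),
         (4, 0, 10), (3, 1, 10), (0, 1, 5), (1, 1, 5), (2, 1, 5)]
ev = expected_value(tiers)
face_price = 2                               # yuan per ticket
effective_price = face_price - ev            # average amount the buyer gives up
print(float(ev), float(effective_price))
```

Under these made-up tiers the expected value stays well below the 2-yuan face price, so the effective price, the quantity the abstract uses as the price variable, is positive.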

4.
We propose a novel statistic for conducting joint tests on all the structural parameters in instrumental variables regression. The statistic is straightforward to compute and equals a quadratic form of the score of the concentrated log-likelihood. It therefore attains its minimal value of zero at the maximum likelihood estimator. The statistic has a χ2 limiting distribution with a degrees of freedom parameter equal to the number of structural parameters. The limiting distribution does not depend on nuisance parameters. The statistic overcomes the deficiencies of the Anderson–Rubin statistic, whose limiting distribution has a degrees of freedom parameter equal to the number of instruments, and of the likelihood-based Wald, likelihood ratio, and Lagrange multiplier statistics, whose limiting distributions depend on nuisance parameters. Size and power comparisons reveal that the statistic is an (asymptotic) size-corrected likelihood ratio statistic. We apply the statistic to the Angrist–Krueger (1991) data and find results similar to those in Staiger and Stock (1997).
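The statistic's two headline properties, a minimum of zero at the MLE and a χ2 limit with degrees of freedom equal to the number of parameters, can be checked in a toy model. The sketch below is not the instrumental-variables statistic itself; it applies the same quadratic-form-of-the-score construction to a one-parameter normal-mean model, where everything is available in closed form:

```python
import random
import statistics

random.seed(0)

def score_statistic(x, theta0, sigma=1.0):
    """Quadratic form of the score for an N(theta, sigma^2) sample:
    S(theta0) = s(theta0)^2 / I, with score s = sum(x_i - theta0) / sigma^2
    and Fisher information I = n / sigma^2."""
    n = len(x)
    s = sum(xi - theta0 for xi in x) / sigma**2
    return s**2 / (n / sigma**2)

# The statistic attains its minimum (zero) at the maximum likelihood estimator.
x = [random.gauss(0.0, 1.0) for _ in range(200)]
mle = statistics.fmean(x)
print(score_statistic(x, mle))     # numerically zero

# Under the null theta = 0 the statistic is chi-square with 1 degree of
# freedom, so its average over many replications should be close to 1.
reps = [score_statistic([random.gauss(0.0, 1.0) for _ in range(200)], 0.0)
        for _ in range(2000)]
print(statistics.fmean(reps))      # close to 1
```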

5.
It is sometimes argued that the use of increasingly complex "biologically-based" risk assessment (BBRA) models to capture increasing mechanistic understanding of carcinogenic processes may run into a practical barrier that cannot be overcome in the near term: the need for unrealistically large amounts of data about pharmacokinetic and pharmacodynamic parameters. This paper shows that, for a class of dynamical models widely used in biologically-based risk assessments, it is unnecessary to estimate the values of the individual parameters. Instead, the input-output properties of such a model, specifically the ratio of the area-under-curve (AUC) for any selected output to the AUC of the input, are determined by a single aggregate "reduced" constant, which can be estimated from measured input and output quantities. Uncertainties about the many individual parameter values of the model, and even uncertainties about its internal structure, are irrelevant for purposes of quantifying and extrapolating its input-output (e.g., dose-response) behavior. We prove that this is the case for the class of linear, constant-coefficient, globally stable compartmental flow systems used in many classical pharmacokinetic and low-dose PBPK models. Examples are cited that suggest that the value of the reduced parameter representing such a system's aggregate behavior may be relatively insensitive to changes in (and hence to uncertainties about) the values of individual parameters. The theory is illustrated with a model of pharmacokinetics and metabolism of cyclophosphamide (CP), a drug widely used in cancer chemotherapy and as an immunosuppressive agent.
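The input-output claim can be demonstrated numerically for the simplest member of the class, a single stable compartment dC/dt = u(t) - kC, where the reduced constant is 1/k. The sketch below (hypothetical parameter values) integrates the system under two very different input shapes and checks that AUC(output)/AUC(input) comes out the same:

```python
def auc_ratio(input_fn, k=0.5, t_end=200.0, dt=0.001):
    """Integrate dC/dt = u(t) - k*C with forward Euler and return
    AUC(C) / AUC(u).  For a stable linear compartment this ratio equals
    the reduced constant 1/k regardless of the shape of the input u(t)."""
    c, auc_c, auc_u, t = 0.0, 0.0, 0.0, 0.0
    while t < t_end:
        u = input_fn(t)
        auc_u += u * dt
        auc_c += c * dt
        c += (u - k * c) * dt
        t += dt
    return auc_c / auc_u

# Two very different dosing profiles, same compartment constant k = 0.5:
short_pulse = lambda t: 10.0 if t < 1.0 else 0.0    # brief, intense input
slow_infusion = lambda t: 0.5 if t < 20.0 else 0.0  # prolonged, low input
r1 = auc_ratio(short_pulse)
r2 = auc_ratio(slow_infusion)
print(r1, r2)   # both close to 1/k = 2.0
```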

6.
We propose a generalized method of moments (GMM) Lagrange multiplier statistic, i.e., the K statistic, that uses a Jacobian estimator based on the continuous updating estimator that is asymptotically uncorrelated with the sample average of the moments. Its asymptotic χ2 distribution therefore holds under a wider set of circumstances, such as weak instruments, than the standard full rank case for the expected Jacobian under which the asymptotic χ2 distributions of the traditional statistics are valid. The behavior of the K statistic can be spurious around inflection points and maxima of the objective function. This inadequacy is overcome by combining the K statistic with a statistic that tests the validity of the moment equations and by an extension of Moreira's (2003) conditional likelihood ratio statistic toward GMM. We conduct a power comparison to test for the risk aversion parameter in a stochastic discount factor model and construct its confidence set for observed consumption growth and asset return series.

7.
In Bayesian environments with private information, as described by the types of Harsanyi, how can types of agents be (statistically) disassociated from each other and how are such disassociations reflected in the agents' knowledge structure? Conditions studied are (i) subjective independence (the opponents' types are independent conditional on one's own) and (ii) type disassociation under common knowledge (the agents' types are independent, conditional on some common-knowledge variable). Subjective independence is motivated by its implications in Bayesian games and in studies of equilibrium concepts. We find that a variable that disassociates types is more informative than any common-knowledge variable. With three or more agents, conditions (i) and (ii) are equivalent. They also imply that any variable which is common knowledge to two agents is common knowledge to all, and imply the existence of a unique common-knowledge variable that disassociates types, which is the one defined by Aumann.

8.
This paper investigates the effects of market size on the ability of price to aggregate traders' private information. To account for heterogeneity in correlation of trader values, a Gaussian model of double auction is introduced that departs from the standard information structure based on a common (fundamental) shock. The paper shows that markets are informationally efficient only if correlations of values coincide across all bidder pairs. As a result, with heterogeneously interdependent values, price informativeness may not increase monotonically with market size. As a necessary and sufficient condition for the monotonicity, price informativeness increases with the number of traders if the implied reduction in (the absolute value of) an average correlation statistic of an information structure is sufficiently small.

9.
The dose to human and nonhuman individuals inflicted by anthropogenic radiation is an important issue in international and domestic policy. The current paradigm for nonhuman populations asserts that if the dose to the maximally exposed individuals in a population is below a certain criterion (e.g., <10 mGy per day) then the population is adequately protected. Currently, there is no consensus in the regulatory community as to the best statistical approach. Statistics currently considered include the maximum likelihood estimator for the 95th percentile of the sample mean and the sample maximum. Recently, the investigators have proposed the use of the maximum likelihood estimate of a very high quantile as an estimate of dose to the maximally exposed individual. In this study, we compare all of the above-mentioned statistics to an estimate based on extreme value theory. To determine and compare the bias and variance of these statistics, we use Monte Carlo simulation techniques, in a procedure similar to a parametric bootstrap. Our results show that a statistic based on extreme value theory has the least bias of those considered here, but requires reliable estimates of the population size. We recommend establishing the criterion based on what would be considered acceptable if only a small percentage of the population exceeded the limit, and hence recommend using the maximum likelihood estimator of a high quantile in the case that reliable estimates of the population size are not available.
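A minimal version of the Monte Carlo comparison can be sketched with a lognormal dose model (all parameters hypothetical). Each replication draws a population, takes a subsample of measured doses, and compares the sample maximum against the MLE of a very high quantile as estimators of the dose to the maximally exposed individual:

```python
import math
import random
import statistics
from statistics import NormalDist

random.seed(1)
MU, SIGMA, N = 0.0, 1.0, 500       # lognormal dose model and population size

def mle_quantile(sample, q):
    """MLE of the q-th lognormal quantile: exp(mu_hat + z_q * sigma_hat)."""
    logs = [math.log(x) for x in sample]
    mu_hat = statistics.fmean(logs)
    sigma_hat = statistics.pstdev(logs)          # MLE uses the 1/n variance
    return math.exp(mu_hat + NormalDist().inv_cdf(q) * sigma_hat)

# Monte Carlo comparison in the spirit of a parametric bootstrap: in each
# replication, draw a population, record the true maximum dose, and compare
# (a) the sample maximum with (b) the MLE of a very high quantile that
# targets the expected rank of the maximum, q = 1 - 1/(2N) here.
bias_max, bias_q = [], []
for _ in range(300):
    pop = [random.lognormvariate(MU, SIGMA) for _ in range(N)]
    true_max = max(pop)
    sample = random.sample(pop, 100)             # doses actually measured
    bias_max.append(max(sample) - true_max)
    bias_q.append(mle_quantile(sample, 1 - 1 / (2 * N)) - true_max)

print(statistics.fmean(bias_max), statistics.fmean(bias_q))
```

Because the sample maximum can never exceed the true population maximum, it is biased low by construction; the high-quantile MLE can extrapolate beyond the largest observed dose, which is why it tends to be less biased, at the cost of needing a reliable population size to set the target quantile.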

10.
This paper uses a database covering the universe of French firms for the period 1990–2007 to provide a forensic account of the role of individual firms in generating aggregate fluctuations. We set up a simple multisector model of heterogeneous firms selling to multiple markets to motivate a theoretically founded decomposition of firms' annual sales growth rate into different components. We find that the firm-specific component contributes substantially to aggregate sales volatility, mattering about as much as the components capturing shocks that are common across firms within a sector or country. We then decompose the firm-specific component to provide evidence on two mechanisms that generate aggregate fluctuations from microeconomic shocks highlighted in the recent literature: (i) when the firm size distribution is fat-tailed, idiosyncratic shocks to large firms directly contribute to aggregate fluctuations, and (ii) aggregate fluctuations can arise from idiosyncratic shocks due to input–output linkages across the economy. Firm linkages are approximately three times as important as the direct effect of firm shocks in driving aggregate fluctuations.
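The decomposition can be mimicked on simulated data: each firm's growth is split into a sector-year average (the common component) and a residual (the firm-specific component), and both are aggregated with size weights drawn from a fat-tailed distribution. All numbers below are hypothetical; the point is only that with concentrated size weights the firm-specific residuals do not wash out of the aggregate:

```python
import random
import statistics

random.seed(7)
N_FIRMS, N_SECTORS, YEARS = 200, 4, 300

# Hypothetical fat-tailed firm sizes (Pareto-like), fixed over time.
sizes = [random.paretovariate(1.1) for _ in range(N_FIRMS)]
total = sum(sizes)
w = [s / total for s in sizes]                    # aggregation weights
sector = [i % N_SECTORS for i in range(N_FIRMS)]  # round-robin sector labels

agg, common, firm_specific = [], [], []
for _ in range(YEARS):
    shock = [random.gauss(0, 0.02) for _ in range(N_SECTORS)]   # sector-year shocks
    g = [shock[sector[i]] + random.gauss(0, 0.05) for i in range(N_FIRMS)]
    # Split each firm's growth into the sector-year average ("common")
    # and the residual around it ("firm-specific"), then size-weight both.
    sec_mean = [statistics.fmean([g[i] for i in range(N_FIRMS)
                                  if sector[i] == s]) for s in range(N_SECTORS)]
    agg.append(sum(w[i] * g[i] for i in range(N_FIRMS)))
    common.append(sum(w[i] * sec_mean[sector[i]] for i in range(N_FIRMS)))
    firm_specific.append(sum(w[i] * (g[i] - sec_mean[sector[i]])
                             for i in range(N_FIRMS)))

# Both components contribute materially to the volatility of aggregate growth.
print(statistics.pstdev(agg), statistics.pstdev(common),
      statistics.pstdev(firm_specific))
```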

11.
Using monthly data from January 1990 to December 2008, this paper applies a logistic smooth transition vector autoregression (LSTVAR) model to describe and test whether the transmission mechanism of China's monetary policy exhibits asymmetric effects. The estimated LSTVAR model and its impulse response functions show that the dynamic responses of China's real output series and inflation process to monetary shocks change with the direction and size of the shock and with the phase of the business cycle. Monetary policy thus has asymmetric effects on real output and the price level, indicating a nonlinear relationship between aggregate supply and aggregate demand.

12.
Using the intuition that financial markets transfer risks in business time, “market microstructure invariance” is defined as the hypotheses that the distributions of risk transfers (“bets”) and transaction costs are constant across assets when measured per unit of business time. The invariance hypotheses imply that bet size and transaction costs have specific, empirically testable relationships to observable dollar volume and volatility. Portfolio transitions can be viewed as natural experiments for measuring transaction costs, and individual orders can be treated as proxies for bets. Empirical tests based on a data set of 400,000+ portfolio transition orders support the invariance hypotheses. The constants calibrated from structural estimation imply specific predictions for the arrival rate of bets (“market velocity”), the distribution of bet sizes, and transaction costs.

13.
In this paper, we build a model where the presence of liquidity constraints tends to magnify the economy's response to aggregate shocks. We consider a decentralized model of trade, where agents may use money or credit to buy goods. When agents do not have access to credit and the real value of money balances is low, agents are more likely to be liquidity constrained. This makes them more concerned about their short-term earning prospects when making their consumption decisions and about their short-term spending opportunities when making their production decisions. This generates a coordination element in spending and production which leads to greater aggregate volatility and greater comovement across producers.

14.
The mechanism design literature assumes too much common knowledge of the environment among the players and planner. We relax this assumption by studying mechanism design on richer type spaces. We ask when ex post implementation is equivalent to interim (or Bayesian) implementation for all possible type spaces. The equivalence holds in the case of separable environments; examples of separable environments arise (1) when the planner is implementing a social choice function (not correspondence) and (2) in a quasilinear environment with no restrictions on transfers. The equivalence fails in general, including in some quasilinear environments with budget balance. In private value environments, ex post implementation is equivalent to dominant strategies implementation. The private value versions of our results offer new insights into the relationship between dominant strategy implementation and Bayesian implementation.

15.
This paper reports an empirical study of the relationship between the competency factors of university knowledge workers and their individual performance. The results show that the 36 competency factors, grouped into 3 clusters, drawn from the competency model for university knowledge workers are strongly and positively correlated with individual performance. By comparing the strength of the correlation between different competency factors and performance, the study offers guidance to help universities and their knowledge workers improve their competencies in a more purposeful, step-by-step way.

16.
Aggregate exposure metrics based on sums or weighted averages of component exposures are widely used in risk assessments of complex mixtures, such as asbestos-associated dusts and fibers. Allowed exposure levels based on total particle or fiber counts and estimated ambient concentrations of such mixtures may be used to make costly risk-management decisions intended to protect human health and to remediate hazardous environments. We show that, in general, aggregate exposure information alone may be inherently unable to guide rational risk-management decisions when the components of the mixture differ significantly in potency and when the percentage compositions of the mixture exposures differ significantly across locations. Under these conditions, which are not uncommon in practice, aggregate exposure metrics may be "worse than useless," in that risk-management decisions based on them are less effective than decisions that ignore the aggregate exposure information and select risk-management actions at random. The potential practical significance of these results is illustrated by a case study of 27 exposure scenarios in El Dorado Hills, California, where applying an aggregate unit risk factor (from EPA's IRIS database) to aggregate exposure metrics produces average risk estimates about 25 times greater, and of uncertain predictive validity, than risk estimates based on specific components of the mixture that have been hypothesized to pose risks of human lung cancer and mesothelioma.
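The qualitative argument reduces to arithmetic. In the made-up example below (hypothetical potencies, compositions, and unit risk, not the El Dorado Hills data), two sites have similar total fiber counts but different shares of a high-potency component, and ranking by the aggregate metric reverses the ranking implied by component-specific risks:

```python
# Illustrative only: invented potencies and mixture compositions.
POTENCY = {"high": 1.0, "low": 0.01}        # hypothetical unit risks per fiber

sites = {
    "site_A": {"high": 1.0, "low": 99.0},   # 100 fibers, mostly low-potency
    "site_B": {"high": 30.0, "low": 50.0},  # 80 fibers, high-potency heavy
}

aggregate_unit_risk = 0.1                   # hypothetical risk per total fiber

results = {}
for name, mix in sites.items():
    total = sum(mix.values())                                   # aggregate metric
    risk_aggregate = aggregate_unit_risk * total
    risk_component = sum(POTENCY[c] * n for c, n in mix.items())
    results[name] = (risk_aggregate, risk_component)
    print(name, total, round(risk_aggregate, 2), round(risk_component, 2))
```

The aggregate metric ranks site_A (100 fibers) above site_B (80 fibers), but the component-based risk ranks site_B far higher; remediation priorities based on the aggregate alone would be exactly backwards.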

17.
This paper first compares three mainstream cojump tests: the LM-based cojump test, the BLT cojump test, and the FHLL cojump test. The three methods differ markedly in the number of cojumps they identify, but the overlap of their results largely corresponds to episodes of sharp market rallies and crashes, indicating that cojump detection is sensitive to the clustering of violent market volatility. To address this clustering of jumps and cojumps, the paper introduces the Hawkes process into the study of jumps and cojumps and builds a Hawkes-process-based factor model. The results show that the MJ and CJ statistics derived from the Hawkes factor model fit the empirical data well, indicating that the factor model better captures the clustering of jumps and cojumps.
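The clustering property that motivates the Hawkes factor model is easy to see in simulation. The sketch below (arbitrary parameters, not the paper's fitted model) simulates a univariate Hawkes process with an exponential kernel via Ogata's thinning algorithm; with branching ratio α/β = 0.5, the expected event count is double the Poisson baseline, and the events arrive in bursts:

```python
import math
import random

random.seed(42)

def intensity(t, events, mu, alpha, beta):
    """Conditional intensity lambda(t) = mu + alpha * sum exp(-beta*(t - t_i))."""
    return mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events)

def simulate_hawkes(mu, alpha, beta, t_end):
    """Ogata thinning: between events the exponential kernel only decays,
    so the intensity at the current time bounds the intensity until the
    next event and can serve as the proposal rate."""
    events, t = [], 0.0
    while True:
        lam_bar = intensity(t, events, mu, alpha, beta)
        t += random.expovariate(lam_bar)                 # propose next event
        if t >= t_end:
            return events
        if random.random() <= intensity(t, events, mu, alpha, beta) / lam_bar:
            events.append(t)                             # accept (thinning step)

# Branching ratio alpha/beta = 0.5: each jump triggers on average half an
# offspring jump, so the expected count is mu*t_end / (1 - 0.5), twice the
# Poisson baseline of mu*t_end, and the jumps arrive in clusters.
jumps = simulate_hawkes(mu=0.5, alpha=1.0, beta=2.0, t_end=500.0)
print(len(jumps))   # around 500, versus about 250 for a plain Poisson process
```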

18.
Common Learning     
Consider two agents who learn the value of an unknown parameter by observing a sequence of private signals. The signals are independent and identically distributed across time but not necessarily across agents. We show that when each agent's signal space is finite, the agents will commonly learn the value of the parameter, that is, that the true value of the parameter will become approximate common knowledge. The essential step in this argument is to express the expectation of one agent's signals, conditional on those of the other agent, in terms of a Markov chain. This allows us to invoke a contraction mapping principle ensuring that if one agent's signals are close to those expected under a particular value of the parameter, then that agent expects the other agent's signals to be even closer to those expected under the parameter value. In contrast, if the agents' observations come from a countably infinite signal space, then this contraction mapping property fails. We show by example that common learning can fail in this case.

19.
Combinatorial Optimization in Rapidly Mutating Drug-Resistant Viruses   (total citations: 1; self-citations: 0; citations by others: 1)
Resistance to chemicals is a common current problem in many pests and pathogens that formerly were controlled by chemicals. An extreme case occurs in rapidly mutating viruses such as Human Immunodeficiency Virus (HIV), where the emergence of selective drug resistance within an individual patient may become an important factor in treatment choice. The HIV patient subpopulation that already has experienced at least one treatment failure due to drug resistance is considered more challenging to treat because the treatment options have been reduced. A triply nested combinatorial optimization problem occurs in computational attempts to optimize HIV patient treatment protocol (drug regimen) with respect to drug resistance, given a set of HIV genetic sequences from the patient. In this paper the optimization problem is characterized, and the objects involved are represented computationally. An implemented branch-and-bound algorithm that computes a solution to the problem is described and proved correct. Data shown include empirical timing results on representative patient data, example clinical output, and summary statistics from an initial small-scale human clinical trial.
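The paper's algorithm operates on real HIV sequence data, but the branch-and-bound pattern itself can be shown on a toy stand-in (all drug names, variants, and susceptibilities below are invented): choose at most K drugs to cover as many viral variants as possible, pruning any branch whose optimistic bound cannot beat the incumbent:

```python
# Hypothetical resistance data: for each variant, the set of drugs it is
# still susceptible to.  Goal: pick at most K drugs that cover (hit) the
# largest number of variants.
VARIANTS = {
    "v1": {"AZT", "3TC"}, "v2": {"EFV"}, "v3": {"AZT", "EFV"},
    "v4": {"3TC", "NVP"}, "v5": {"NVP"}, "v6": {"AZT"},
}
DRUGS = sorted({d for s in VARIANTS.values() for d in s})
K = 2

def coverage(regimen):
    """Number of variants susceptible to at least one chosen drug."""
    return sum(1 for s in VARIANTS.values() if s & regimen)

def branch_and_bound(i=0, chosen=frozenset(), best=(0, frozenset())):
    """Depth-first search over include/exclude decisions for each drug,
    pruning with an optimistic bound: current coverage plus every uncovered
    variant that any remaining drug could still cover."""
    cov = coverage(chosen)
    if cov > best[0]:
        best = (cov, chosen)                    # new incumbent
    if i == len(DRUGS) or len(chosen) == K:
        return best                             # leaf: no more decisions
    rest = set(DRUGS[i:])
    optimistic = cov + sum(1 for s in VARIANTS.values()
                           if not (s & chosen) and (s & rest))
    if optimistic <= best[0]:
        return best                             # prune: cannot beat incumbent
    best = branch_and_bound(i + 1, chosen | {DRUGS[i]}, best)   # include drug i
    return branch_and_bound(i + 1, chosen, best)                # exclude drug i

score, regimen = branch_and_bound()
print(score, sorted(regimen))
```

On this toy instance the search returns the two-drug regimen covering five of the six variants, matching what a brute-force enumeration of all regimens of size at most K would find.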

20.
We model an organization as a game in which all agents share a common decision problem and some level of coordination is necessary between individual actions. Agents have individual private information concerning the task they have to perform, and they share this private information through pairwise channels of communication. We analyze how this communication pattern, modeled by means of a network structure, affects individual behavior and aggregate welfare. In the unique equilibrium of this Bayesian game, each agent's optimal action depends on a properly defined knowledge index that measures how the aggregation of information helps him to infer higher-order beliefs about others' information after communication. Adding communication channels is not always beneficial for the organization because it can lead to miscoordination. We single out the geometry of interagent communication links that the manager could implement in order to improve the organization's performance.
