Similar Documents
20 similar documents retrieved.
1.
One of the challenges of introducing greater biological realism into stochastic models of cancer induction is to find a way to represent the homeostatic control of the normal cell population over its own size without complicating the analysis too much to obtain useful results. Current two-stage models of carcinogenesis typically ignore homeostatic control. Instead, a deterministic growth path is specified for the population of "normal" cells, while the population of "initiated" cells is assumed to grow randomly according to a birth-death process with random immigrations from the normal population. This paper introduces a simple model of homeostatically controlled cell division for mature tissues, in which the size of the nonmalignant population remains essentially constant over time. Growth of the nonmalignant cell population (normal and initiated cells) is restricted by allowing cells to divide only to fill the "openings" left by cells that die or differentiate, thus maintaining the constant size of the nonmalignant cell population. The fundamental technical insight from this model is that random walks, rather than birth-and-death processes, are the appropriate stochastic processes for describing the kinetics of the initiated cell population. Qualitative and analytic results are presented, drawn from the mathematical theories of random walks and diffusion processes, that describe the probability of spontaneous extinction and the size distribution of surviving initiated populations when the death/differentiation rates of normal and initiated cells are known. The constraint that the nonmalignant population size must remain approximately constant leads to much simpler analytic formulas and approximations, flowing directly from random walk theory, than in previous birth-death models.
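A minimal Monte Carlo sketch of the abstract's central point: when divisions only fill openings left by deaths, the initiated-cell count performs a simple random walk, and the spontaneous-extinction probability follows from classical gambler's-ruin theory. The step probability `p_up` and all numbers below are illustrative assumptions, not the paper's parameters.

```python
import random

def extinction_probability(p_up, start=1, max_steps=5_000, n_trials=5_000):
    """Monte Carlo estimate of the probability that the initiated-cell count,
    modeled as a simple random walk, hits 0 (spontaneous extinction).
    p_up: probability that an initiated cell fills an opening (+1); otherwise
    an initiated cell dies or differentiates (-1)."""
    extinct = 0
    for _ in range(n_trials):
        k = start
        for _ in range(max_steps):
            k += 1 if random.random() < p_up else -1
            if k == 0:
                extinct += 1
                break
    return extinct / n_trials

random.seed(0)
# Gambler's-ruin result: from one initiated cell, P(extinction) = min(1, q/p).
# (At p = 0.5 convergence is slow, so the step-capped simulation slightly
# underestimates the analytic value of 1.)
for p in (0.45, 0.50, 0.55):
    print(f"p={p:.2f}  simulated={extinction_probability(p):.3f}  "
          f"analytic={min(1.0, (1 - p) / p):.3f}")
```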

2.
3.
Real-world exposure measurements are a necessary ingredient for subsequent detailed study of the risks from an environmental pollutant. For volatile organic compounds, researchers are applying exhaled breath analysis and the time dependence of concentrations as a noninvasive indicator of exposure, dose, and blood levels. To optimize the acquisition of such data, samples must be collected in a time frame suited to the needs of the mathematical model, within physical limitations of the equipment and subjects, and within logistical constraints. Additionally, one must consider the impact of measurement error on the eventual extraction of biologically and physiologically relevant parameters. Given a particular mathematical model for the elimination kinetics (in this case a very simple pharmacokinetic model based upon a multiterm exponential decay function that has been shown to fit real-world data extremely well), we investigated the effects on synthetic data caused by sample timing, random measurement error, and number of terms included in the model. This information generated a series of conditions for collecting samples and performing analyses dependent upon the eventual informational needs, and it provided an estimate of error associated with various choices and compromises. Though the work was geared specifically toward breath sampling, it is equally applicable to direct blood measurements in optimizing sampling strategy and improving the exposure assessment process.
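A sketch of the kind of synthetic-data experiment the abstract describes: generate a multiterm exponential decay curve, add random measurement error, and refit under different sampling schedules. The two-compartment rate constants, noise level, and schedules are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def two_term_decay(t, a1, k1, a2, k2):
    """Two-term exponential elimination: C(t) = a1*exp(-k1*t) + a2*exp(-k2*t)."""
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

true = (50.0, 1.2, 10.0, 0.08)   # assumed a1, k1 (1/h), a2, k2 (1/h)

def experiment(sample_times, cv=0.10):
    """Generate noisy samples at the given times and refit the model."""
    c = two_term_decay(sample_times, *true)
    obs = c * (1 + cv * rng.standard_normal(c.size))   # multiplicative error
    popt, _ = curve_fit(two_term_decay, sample_times, obs,
                        p0=(40, 1.0, 8, 0.05), maxfev=10_000)
    return popt

# Dense early sampling resolves the fast term; late samples pin the slow term.
early_heavy = np.concatenate([np.linspace(0.05, 2, 12), np.linspace(3, 24, 6)])
uniform     = np.linspace(0.05, 24, 18)

for name, times in [("early-heavy", early_heavy), ("uniform", uniform)]:
    a1, k1, a2, k2 = experiment(times)
    print(f"{name:12s} k1={k1:.2f} (true {true[1]})  k2={k2:.3f} (true {true[3]})")
```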

4.
The ability of cells of a particular type of bacteria to leave lag phase and begin dividing, or to survive heat treatment, can depend on the serotype or strain of the bacteria. This article reports an investigation of serotype-specific differences in growth and heat resistance kinetics of clinical and food isolates of Salmonella. Growth kinetics at 19 degrees C and 37 degrees C were examined in brain heart infusion broth, and heat resistance kinetics at 60 degrees C were examined in beef gravy using a submerged coil heating apparatus. Estimates of the parameters of the growth curves suggest a small between-serotype variance of the growth kinetics. However, for inactivation, the results suggest a significant between-serotype effect on the asymptotic D-values, with an estimated between-serotype CV of about 20%. In microbial risk assessment, predictive microbiology is used to estimate growth and inactivation of pathogens. Often the data used for estimating the growth or inactivation kinetics are based on measurements on a cocktail, a mixture of approximately equal proportions of several serotypes or strains of the pathogen being studied. The expected growth or inactivation rates derived from data using cocktails are biased, reflecting the characteristics of the fastest growing or most heat resistant serotype of the cocktail. In this article, an adjustment to decrease this possible bias in a risk assessment is offered. The article also presents a discussion of the effect on estimating growth when stochastic assumptions are incorporated in the model. In particular, equations describing the variation of relative growth are derived, accounting for the stochastic variations of the division of cells. For small numbers of cells, the expected value of the relative growth is not an appropriate "representative" value for actual relative growths that might occur.
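The cocktail bias the abstract describes falls out of standard log-linear inactivation kinetics: a mixture's survivor curve is increasingly dominated by its most heat-resistant serotype. The D-values below are hypothetical, chosen only to match the reported between-serotype CV of roughly 20%.

```python
import numpy as np

# Hypothetical asymptotic D-values (min, at 60 C) for five serotypes,
# spread with a between-serotype CV of roughly 20% around a mean of 1.0 min.
D = np.array([0.72, 0.90, 1.00, 1.10, 1.28])
N0 = 1e7 / D.size                 # equal proportions in the cocktail

t = np.linspace(0, 6, 13)         # heating times, min
survivors = (N0 * 10.0 ** (-t[:, None] / D[None, :])).sum(axis=1)

# Apparent D-value on each interval: -(dt) / d(log10 N).
logN = np.log10(survivors)
D_apparent = -(t[1:] - t[:-1]) / (logN[1:] - logN[:-1])
for ti, Da in zip(t[1:], D_apparent):
    print(f"t = {ti:4.1f} min   apparent D = {Da:.2f} min")
# The apparent D drifts toward max(D) = 1.28 as heating proceeds: cocktail
# data increasingly reflect the most heat-resistant serotype, which is the
# bias the article's adjustment is meant to correct.
```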

5.
Friedman and others argued that, through the market mechanism, securities markets eliminate irrational traders in the long run. To subject this claim to rigorous quantitative analysis, this paper proposes an evolutionary model of a securities market in which irrational and rational traders coexist, applying a birth-death process to describe the dynamics by which the numbers of rational and irrational traders in the market rise and fall. The results show that even if irrational traders lose money on average, they do not necessarily disappear from the market: whether irrational traders survive or die out depends on several factors, including their initial wealth, the rate at which they enter the market, and the profits and losses of their trades.
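A minimal Gillespie-style sketch in the spirit of the paper's conclusion: a birth-death process with an entry (immigration) term need not go extinct even when departures outpace births. The rates and the mapping of "losses on average" to `death > birth` are assumptions for illustration, not the paper's specification.

```python
import random

def simulate_traders(n0, birth, death, entry, t_max=500.0):
    """Gillespie simulation of a birth-death process with immigration:
    n -> n+1 at rate birth*n + entry  (new irrational traders arrive),
    n -> n-1 at rate death*n          (irrational traders exit after losses).
    Returns the population size at t_max."""
    t, n = 0.0, n0
    while t < t_max:
        up, down = birth * n + entry, death * n
        total = up + down
        if total == 0:
            break
        t += random.expovariate(total)
        n += 1 if random.random() < up / total else -1
    return n

random.seed(1)
# Even with death > death-adjusted birth (losing money on average), a positive
# entry rate keeps the irrational-trader population from vanishing: the
# long-run mean is roughly entry / (death - birth) = 5 here.
runs = [simulate_traders(n0=20, birth=0.9, death=1.0, entry=0.5)
        for _ in range(200)]
print("mean population at t=500:", sum(runs) / len(runs))
print("fraction of runs at zero at t=500:", sum(r == 0 for r in runs) / len(runs))
```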

6.
In many animal model systems for carcinogenesis, well characterized putative premalignant lesions are observed. A much studied example is provided by the enzyme altered foci in rodent hepatocarcinogenesis experiments. In a recent paper, we proposed a method for the quantitative analysis of such premalignant lesions. The model used in that paper assumed that the mean growth of premalignant clones is exponential. However, it has been suggested that such a model is oversimplified. In this paper, we relax the assumption of exponential mean growth. The new model contains one extra parameter that measures departures from exponentiality. Use of the model is illustrated by analysis of ATPase deficient foci in the liver of rats given NNM (N-nitrosomorpholine) in their drinking water. The analysis suggests that the clonal growth of altered cells is significantly accelerated (superexponential) for nontoxic doses of NNM. Finally, the hazard function of the two-mutation model for carcinogenesis is briefly discussed under nonexponential (mean) growth of intermediate cells.
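For reference, a commonly used rare-conversion approximation to the two-mutation hazard, with a one-parameter departure from exponential mean growth. The specific nonexponential parameterization below is illustrative, not necessarily the one used in the paper.

```latex
% Rare-conversion approximation to the two-mutation hazard: X(s) normal
% cells, nu first-mutation rate, mu second-mutation rate, and m(s,t) the
% mean size at time t of an intermediate clone initiated at time s.
h(t) \;\approx\; \mu\,\nu \int_0^{t} X(s)\, m(s,t)\,\mathrm{d}s ,
\qquad
m(s,t) = e^{(\alpha-\beta)(t-s)} \quad \text{(exponential mean growth)} .
% An illustrative one-parameter departure from exponentiality:
m(s,t) \;=\; \exp\!\bigl[(\alpha-\beta)\,(t-s)^{\,1+\delta}\bigr],
\qquad \delta = 0 \ \text{recovers the exponential case};\ \
\delta > 0 \ \text{gives superexponential mean growth}.
```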

7.
This paper considers a production process in which product quality follows a stochastic drifting process. The drift is assumed to depend on the process state: the deterioration rate increases as the process state deviates from its target. A discrete geometric jump model is studied first, and a continuous approximation of the drifting process is then examined. The optimal adjustment level is derived from a cost model.
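The abstract does not give the cost model, so the toy below assumes a quadratic off-target cost, a fixed adjustment cost, and a jump probability that grows with the deviation; it then grid-searches the adjustment level. A sketch of the structure only, not the paper's model.

```python
import random

def average_cost(L, n_steps=200_000, jump=0.05, base_rate=0.02,
                 state_gain=0.5, off_target=1.0, adjust_cost=25.0, seed=0):
    """Toy drifting process: each period the state jumps up by `jump` with
    probability base_rate*(1 + state_gain*x), so deterioration accelerates
    as x deviates from the target 0.  When x >= L the process is reset to 0
    at a fixed adjustment cost.  Returns the long-run average cost."""
    rng = random.Random(seed)
    x, cost = 0.0, 0.0
    for _ in range(n_steps):
        if rng.random() < min(1.0, base_rate * (1 + state_gain * x)):
            x += jump
        cost += off_target * x * x          # quadratic quality loss
        if x >= L:
            cost += adjust_cost
            x = 0.0
    return cost / n_steps

best = min(((L, average_cost(L)) for L in (0.3, 0.5, 0.8, 1.2, 1.8, 2.5)),
           key=lambda p: p[1])
print("best adjustment level L* = %.1f (average cost %.3f)" % best)
```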

8.
Because of the increase in workplace automation and the diversification of industrial processes, workplaces have become more and more complex. The classical approaches used to address workplace hazard concerns, such as checklists or sequence models, are, therefore, of limited use in such complex systems. Moreover, because of the multifaceted nature of workplaces, the use of single-oriented methods, such as AEA (man oriented), FMEA (system oriented), or HAZOP (process oriented), is not satisfactory. The use of a dynamic modeling approach in order to allow multiple-oriented analyses may constitute an alternative to overcome this limitation. The qualitative modeling aspects of the MORM (man-machine occupational risk modeling) model are discussed in this article. The model, realized on an object-oriented Petri net tool (CO-OPN), has been developed to simulate and analyze industrial processes in an OH&S perspective. The industrial process is modeled as a set of interconnected subnets (state spaces), which describe its constitutive machines. Process-related factors are introduced, in an explicit way, through machine interconnections and flow properties. While man-machine interactions are modeled as triggering events for the state spaces of the machines, the CREAM cognitive behavior model is used in order to establish the relevant triggering events. In the CO-OPN formalism, the model is expressed as a set of interconnected CO-OPN objects defined over data types expressing the measure attached to the flow of entities transiting through the machines. Constraints on the measures assigned to these entities are used to determine the state changes in each machine. Interconnecting machines implies the composition of such flow and consequently the interconnection of the measure constraints. This is reflected by the construction of constraint enrichment hierarchies, which can be used for simulation and analysis optimization in a clear mathematical framework. The use of Petri nets to perform multiple-oriented analysis opens perspectives in the field of industrial risk management. It may significantly reduce the duration of the assessment process. But, most of all, it opens perspectives in the field of risk comparisons and integrated risk management. Moreover, because of the generic nature of the model and tool used, the same concepts and patterns may be used to model a wide range of systems and application fields.
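A toy illustration of the core idea of composing measure constraints along machine interconnections. This is plain Python, not CO-OPN, and the machines, constraints, and "operator cools the part" interaction are invented for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class Machine:
    """A machine as a small state space: entities carry a 'measure'
    (e.g., weight, temperature); an entity is accepted only if it satisfies
    the machine's constraint, mimicking a guarded transition."""
    name: str
    constraint: callable
    buffer: list = field(default_factory=list)

    def accept(self, entity):
        if self.constraint(entity):
            self.buffer.append(entity)
            return True
        print(f"{self.name}: constraint violated for {entity} -> hazard state")
        return False

# Interconnection composes the constraints: an entity must satisfy every
# machine it flows through ("constraint enrichment" along the line).
line = [Machine("press",  lambda e: e["weight"] <= 50),
        Machine("oven",   lambda e: e["temp"]  <= 200),
        Machine("packer", lambda e: e["weight"] <= 50 and e["temp"] <= 60)]

entity = {"weight": 42, "temp": 250}
for m in line:
    if not m.accept(entity):
        break                 # analysis flags a man-machine hazard here
    entity["temp"] -= 70      # man-machine interaction: operator cools the part
```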

9.
In recent years physiologically based pharmacokinetic models have come to play an increasingly important role in risk assessment for carcinogens. The hope is that they can help open the black box between external exposure and carcinogenic effects to experimental observations, and improve both high-dose to low-dose and interspecies projections of risk. However, to date, there have been only relatively preliminary efforts to assess the uncertainties in current modeling results. In this paper we compare the physiologically based pharmacokinetic models (and model predictions of risk-related overall metabolism) that have been produced by seven different sets of authors for perchloroethylene (tetrachloroethylene). The most striking conclusion from the data is that most of the differences in risk-related model predictions are attributable to the choice of the data sets used for calibrating the metabolic parameters. Second, it is clear that the bottom-line differences among the model predictions are appreciable. Overall, the ratios of low-dose human to bioassay rodent metabolism spanned a 30-fold range for the six available human/rat comparisons, and the seven predicted ratios of low-dose human to bioassay mouse metabolism spanned a 13-fold range. (The greater range for the rat/human comparison is attributable to a structural assumption by one author group of competing linear and saturable pathways, and their conclusion that the dangerous saturable pathway constitutes a minor fraction of metabolism in rats.) It is clear that there are a number of opportunities for modelers to make different choices of model structure, interpretive assumptions, and calibrating data in the process of constructing pharmacokinetic models for use in estimating "delivered" or "biologically effective" dose for carcinogenesis risk assessments. We believe that in presenting the results of such modeling studies, it is important for researchers to explore the results of alternative, reasonably likely approaches for interpreting the available data, and either show that any conclusions they make are relatively insensitive to particular interpretive choices or acknowledge the differences in conclusions that would result from plausible alternative views of the world.
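To see why the "competing linear and saturable pathways" assumption matters, here is a minimal one-compartment sketch with Michaelis-Menten and first-order pathways competing for the same parent compound. All parameter values are illustrative, not calibrated to perchloroethylene.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters: Vmax (mg/h), Km (mg/L), linear clearance (L/h),
# exhalation rate constant (1/h), volume of distribution (L).
Vmax, Km, CL_lin, k_e, V = 5.0, 0.2, 0.5, 0.05, 40.0

def rhs(t, y, dose_rate):
    """y = [amount in body, cumulative saturable metab., cumulative linear]."""
    c = y[0] / V
    sat = Vmax * c / (Km + c)        # saturable (Michaelis-Menten) pathway
    lin = CL_lin * c                 # linear first-order pathway
    return [dose_rate - sat - lin - k_e * y[0], sat, lin]

for dose_rate in (0.1, 50.0):        # low vs. high continuous intake, mg/h
    sol = solve_ivp(rhs, (0, 400), [0, 0, 0], args=(dose_rate,), rtol=1e-8)
    _, met_sat, met_lin = sol.y[:, -1]
    frac = met_sat / (met_sat + met_lin)
    print(f"dose = {dose_rate:5.1f} mg/h   saturable share of metabolism = {frac:.2f}")
# At low dose the saturable pathway dominates; at high dose it saturates and
# the linear pathway takes over, which is exactly the structural choice the
# text identifies as driving large differences in low-dose risk predictions.
```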

10.
Estimation of Unit Risk for Coke Oven Emissions
In 1984, based on epidemiological data on cohorts of coke oven workers, USEPA estimated a unit risk for lung cancer associated with continuous exposure from birth to 1 μg/m3 of coke oven emissions of 6.2 × 10^-4. This risk assessment was based on information on the cohorts available through 1966. Follow-up of these cohorts has now been extended to 1982 and, moreover, individual job histories, which were not available in 1984, have been constructed. In this study, lung cancer mortality in these cohorts of coke oven workers with extended follow-up was analyzed using standard techniques of survival analysis and a new approach based on the two-stage clonal expansion model of carcinogenesis. The latter approach allows the explicit consideration of detailed patterns of exposure of each individual in the cohort. The analyses used the extended follow-up data through 1982 and the detailed job histories now available. Based on these analyses, the best estimate of unit risk is 1.5 × 10^-4, with 95% confidence interval 1.2 × 10^-4 to 1.8 × 10^-4.

11.
Structural breaks are pervasive in macroeconomics, and the quality of a model's estimators is sensitive to the size of the estimation sample. For time-varying parameter models, this paper establishes a criterion for selecting the rolling-window width: the window size is chosen by minimizing an approximate quadratic loss function of the estimators and maximizing the Manhattan distance between the estimators from different subsamples, thereby balancing the two conflicting goals of estimator accuracy and time variability. Monte Carlo simulation experiments show that the proposed method works under a variety of structural-break scenarios, can be applied to time-varying parameter models with both linear and nonlinear relationships, and is robust in all cases. Applying the method to identifying structural breaks in China's financial network markedly improves on the results of traditional window-width selection methods.
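A sketch of the underlying trade-off: short rolling windows track a break quickly but give noisy estimates, while long windows are precise but slow to adapt. The diagnostic printed below (pre-break variance vs. total Manhattan variation of the estimates) is an illustrative stand-in inspired by the abstract, not the paper's criterion.

```python
import numpy as np

rng = np.random.default_rng(3)

# Time-varying slope with a structural break at t = 250.
T = 400
beta = np.where(np.arange(T) < 250, 1.0, -1.0)
x = rng.standard_normal(T)
y = beta * x + 0.5 * rng.standard_normal(T)

def rolling_beta(w):
    """OLS slope estimated on each trailing window of width w."""
    b = np.full(T, np.nan)
    for t in range(w, T):
        xs, ys = x[t - w:t], y[t - w:t]
        b[t] = xs @ ys / (xs @ xs)
    return b

for w in (20, 40, 80, 160):
    b = rolling_beta(w)
    noise = np.var(b[w:200])          # pre-break stretch: true beta constant
    tracking = np.sum(np.abs(np.diff(b[w:])))   # Manhattan (total) variation
    print(f"w = {w:3d}   noise = {noise:.4f}   total variation = {tracking:.2f}")
# A window-selection rule must trade these two columns off against each other,
# which is the accuracy-versus-time-variability tension the paper formalizes.
```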

12.
Knowledge discovery in databases (KDD) provides organizations with the tools needed to sift through vast data stores to extract knowledge, a process that supports and improves decision making in organizations. In this paper, we introduce and define the concept of knowledge refreshing, a critical step for ensuring the quality and timeliness of knowledge discovered in a KDD process that prior research has unfortunately overlooked. Specifically, we study knowledge refreshing from the perspective of when to refresh knowledge so that the total system cost over a time horizon is minimized. We propose a model for knowledge refreshing and a dynamic programming methodology for developing optimal strategies. We demonstrate the effectiveness of the proposed methodology using data from a real-world application. The proposed methodology gives decision makers guidance in running KDD effectively and efficiently.
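A minimal dynamic-programming sketch of the "when to refresh" decision: each period, either keep the current knowledge (paying a staleness cost that grows with its age, an assumed linear form) or refresh at a fixed cost. The cost structure and numbers are illustrative, not the paper's model.

```python
import functools

T, K, c = 24, 10.0, 0.8   # horizon (periods), refresh cost, staleness cost/age

@functools.lru_cache(maxsize=None)
def best(t, age):
    """Minimum cost from period t to the horizon given knowledge age;
    returns (cost, tuple of periods in which to refresh)."""
    if t == T:
        return 0.0, ()
    # Option 1: keep current knowledge and pay the staleness cost c*age.
    keep_cost, keep_plan = best(t + 1, age + 1)
    keep = (c * age + keep_cost, keep_plan)
    # Option 2: refresh now at cost K (knowledge age resets).
    ref_cost, ref_plan = best(t + 1, 1)
    refresh = (K + ref_cost, (t,) + ref_plan)
    return min(keep, refresh)

cost, plan = best(0, 1)
print(f"optimal total cost {cost:.1f}; refresh in periods {plan}")
# With these numbers the optimal policy refreshes roughly every
# sqrt(2K/c) = 5 periods, the classic balance of fixed vs. accumulating cost.
```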

13.
Quantile regression (QR) fits a linear model for conditional quantiles just as ordinary least squares (OLS) fits a linear model for conditional means. An attractive feature of OLS is that it gives the minimum mean‐squared error linear approximation to the conditional expectation function even when the linear model is misspecified. Empirical research using quantile regression with discrete covariates suggests that QR may have a similar property, but the exact nature of the linear approximation has remained elusive. In this paper, we show that QR minimizes a weighted mean‐squared error loss function for specification error. The weighting function is an average density of the dependent variable near the true conditional quantile. The weighted least squares interpretation of QR is used to derive an omitted variables bias formula and a partial quantile regression concept, similar to the relationship between partial regression and OLS. We also present asymptotic theory for the QR process under misspecification of the conditional quantile function. The approximation properties of QR are illustrated using wage data from the U.S. census. These results point to major changes in inequality from 1990 to 2000.
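For concreteness, the standard check-function definition of QR, together with the weighted-least-squares restatement the abstract describes. The weight's exact formula is stated only loosely, following the abstract's own description.

```latex
% Standard check-function definition of the quantile regression coefficient:
\hat\beta(\tau) \;=\; \arg\min_{b}\; \mathbb{E}\bigl[\rho_\tau(Y - X'b)\bigr],
\qquad
\rho_\tau(u) \;=\; \bigl(\tau - \mathbf{1}\{u < 0\}\bigr)\,u .
% The paper's interpretation: the same beta(tau) solves a weighted
% least-squares problem against the true conditional quantile Q_tau(Y|X):
\hat\beta(\tau) \;=\; \arg\min_{b}\;
\mathbb{E}\bigl[\, w_\tau(X, b)\,\bigl(X'b - Q_\tau(Y\mid X)\bigr)^{2}\bigr],
% where w_tau averages the conditional density of Y over the segment between
% X'b and Q_tau(Y|X), i.e., the "density near the true conditional quantile."
```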

14.
Leadership research has a long history of a quantitative approach, which remains the most commonly used approach among leadership researchers. Researchers in a variety of fields have been applying mixed methods designs to their research as a way to advance theory. Mixed methods designs are used for collecting, analyzing, and mixing both quantitative and qualitative data in a single study or series of studies to both explain and explore specific research questions. This article provides a review of the basic characteristics of mixed methods designs. A broad series of leadership approaches is offered to illustrate how mixed methods designs have already been applied and where they might be directed in future research. Our review of articles published in the Leadership Quarterly between 1990 and June 2012 revealed that mixed methods designs have rarely been applied to leadership research. Of the articles reviewed, only 15 studies were found to represent mixed methods research according to our conceptual framework. The overall intent of this article is to highlight the value of purposeful application of mixed methods designs toward advancing leadership theory and/or theoretical thinking about leadership phenomena.

15.
Model averaging for dichotomous dose–response estimation is preferred to estimating the benchmark dose (BMD) from a single model, but challenges in implementing these methods for general analyses must be resolved before model averaging is feasible in many risk assessment applications, and there has been little work on Bayesian methods that include informative prior information for both the models and the parameters of the constituent models. This article introduces a novel approach that addresses many of these challenges while providing a fully Bayesian framework. Furthermore, in contrast to methods that use Markov chain Monte Carlo, we approximate the posterior density using maximum a posteriori estimation. The approximation allows for an accurate and reproducible estimate while maintaining the speed of maximum likelihood, which is crucial in many applications such as processing massive high-throughput data sets. We assess this method by applying it to empirical laboratory dose–response data and measuring the coverage of confidence limits for the BMD. We compare the coverage of this method to that of other approaches using the same set of models. Through the simulation study, the method is shown to be markedly superior to the traditional approach of selecting a single preferred model (e.g., from the U.S. EPA BMD software) for the analysis of dichotomous data and is comparable or superior to the other approaches.
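A minimal sketch of the ingredients (MAP fits, Laplace-approximated model weights, model-averaged BMD) rather than the article's method or the EPA software: two toy dichotomous models, synthetic data, placeholder priors, and BFGS's inverse-Hessian approximation standing in for the true Hessian.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit
from scipy.stats import norm

# Synthetic dichotomous dose-response data: dose, n tested, n affected.
dose = np.array([0.0, 12.5, 25.0, 50.0, 100.0])
n    = np.array([50,  50,   50,   50,   50])
k    = np.array([2,   4,    9,    20,   41])

models = {
    "logistic": lambda d, th: expit(th[0] + th[1] * d),
    "probit":   lambda d, th: norm.cdf(th[0] + th[1] * d),
}

def neg_log_post(th, f):
    """Binomial log-likelihood plus weak normal priors (placeholders for the
    informative priors the article allows)."""
    p = np.clip(f(dose, th), 1e-9, 1 - 1e-9)
    loglik = np.sum(k * np.log(p) + (n - k) * np.log(1 - p))
    logprior = -0.5 * np.sum((th / np.array([10.0, 1.0])) ** 2)
    return -(loglik + logprior)

def bmd(f, th, bmr=0.1):
    """Dose at which extra risk over background reaches bmr (grid search)."""
    d = np.linspace(0, dose.max(), 20_001)
    p0 = f(0.0, th)
    extra = (f(d, th) - p0) / (1 - p0)
    return d[np.searchsorted(extra, bmr)]

fits, log_evid = {}, {}
for name, f in models.items():
    res = minimize(neg_log_post, x0=np.array([-2.0, 0.05]), args=(f,),
                   method="BFGS")
    # Laplace approximation to the log marginal likelihood at the MAP; the
    # BFGS inverse-Hessian estimate is a rough stand-in for the true Hessian.
    log_evid[name] = (-res.fun + np.log(2 * np.pi)
                      + 0.5 * np.log(np.linalg.det(res.hess_inv)))
    fits[name] = res.x

w = np.exp(np.array(list(log_evid.values())) - max(log_evid.values()))
w /= w.sum()
bmds = np.array([bmd(models[m], fits[m]) for m in models])
print("model weights:", dict(zip(models, np.round(w, 3))))
print("model-averaged BMD:", round(float((w * bmds).sum()), 2))
```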

16.
Understanding the nature of service failures and their impact on customer responses and designing cost‐effective recovery strategies have been recognized as important issues by both service researchers and practitioners. We first propose a conceptual framework of service failure and recovery strategies. We then transform it into a mathematical model to assist managers in deciding on appropriate resource allocations for outcome and process recovery strategies based on customer risk profiles and the firm's cost structures. Based on this mathematical model we derive optimal recovery strategies, conduct sensitivity analyses of the optimal solutions for different model parameters, and illustrate them through numerical examples. We conclude with a discussion of managerial implications and directions for future research.
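A toy numerical illustration of the allocation question only; the saturating response curves, equal weights, and customer-value and risk parameters below are invented for illustration and bear no relation to the paper's actual model.

```python
import numpy as np
from scipy.optimize import minimize

V, r = 400.0, 0.7   # assumed customer lifetime value and risk weight

def expected_loss(x):
    """Retention improves with outcome spend x_o (e.g., refunds) and process
    spend x_p (e.g., handling speed), each with diminishing returns."""
    x_o, x_p = x
    retain = 0.5 * (1 - np.exp(-0.08 * x_o)) + 0.5 * (1 - np.exp(-0.05 * x_p))
    return -(V * r * retain) + x_o + x_p     # retained value vs. total spend

res = minimize(expected_loss, x0=[5.0, 5.0],
               bounds=[(0, None), (0, None)], method="L-BFGS-B")
print("optimal outcome / process spend:", np.round(res.x, 1))
# Interior optimum: spend on each lever until its marginal retention value
# falls to 1, the kind of first-order condition a sensitivity analysis probes.
```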

17.
《Omega》2001,29(3):249-272
There have been many survey papers in the area of project scheduling in recent years. These papers have primarily emphasized modeling and algorithmic contributions for specific classes of project scheduling problems, such as net present value (NPV) maximization and makespan minimization, with and without resource constraints. Paralleling these developments has been the research in the area of project scheduling decision support, with its emphasis on the data sets, data generation methods, and related resources that are essential to benchmark, evaluate, and compare new models, algorithms, and heuristic techniques. These investigations have extended the frontiers of research and application in all areas of project scheduling and management. In this paper, we survey the vast literature in this area with a perspective that integrates models, data, and optimal and heuristic algorithms for the major classes of project scheduling problems. We also include recent surveys that have compared commercial project scheduling systems. Finally, we present an overview of web-based decision support systems and discuss the potential of this technology in enabling and facilitating researchers and practitioners in identifying new areas of inquiry and application.

18.
In practice, the missions of many engineered systems consist of multiple phases, and both the external environment and the system requirements may change from phase to phase, while operating components can fail in any phase; this poses challenges for system reliability analysis and modeling. To increase system reliability, a common design approach is to provide standby components. For modeling the reliability of demand-based warm standby systems under phased missions, this paper first proposes a method based on multi-valued decision diagrams, then gives formulas for the reliability of components in different states, and finally shows through a numerical example that the model is suitable for reliability analysis of warm standby systems with phased-mission demands.
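A Monte Carlo sketch of the system class being modeled, not the paper's analytic multi-valued decision diagram method: one active unit with a warm standby, exponential lifetimes (so memorylessness makes per-phase resampling valid), and phase-dependent failure rates. All rates and durations are illustrative.

```python
import random

random.seed(7)

# (phase duration in hours, active unit's failure rate per hour); the dormant
# warm standby is assumed to fail at warm_factor times the active rate.
phases      = [(10.0, 0.010), (25.0, 0.004), (5.0, 0.020)]
warm_factor = 0.25

def mission_success():
    """One trial: the primary operates; on its failure the warm standby
    (if it survived dormancy) is switched in for the rest of the mission."""
    standby_alive, on_standby = True, False
    for duration, lam in phases:
        t = 0.0
        while True:
            t_op = random.expovariate(lam)              # operating unit fails
            if on_standby or not standby_alive:
                if t + t_op >= duration:
                    break                               # lone unit survives phase
                return False                            # lone unit fails mid-phase
            t_dorm = random.expovariate(warm_factor * lam)
            if t + min(t_op, t_dorm) >= duration:
                break                                   # phase ends first
            if t_dorm < t_op:
                standby_alive = False                   # standby lost dormant
                t += t_dorm                             # memoryless: resample
            else:
                on_standby = True                       # switch to standby
                t += t_op
    return True

trials = 100_000
ok = sum(mission_success() for _ in range(trials))
print(f"estimated mission reliability: {ok / trials:.4f}")
```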

19.
A two-mutation model for carcinogenesis is reviewed. General principles in fitting the model to epidemiologic and experimental data are discussed, and some examples are given. A general solution to the model with time-dependent parameters is developed, and its use is illustrated by application to data from an experiment in which rats exposed to radon developed lung tumors.
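A numerical sketch of how a time-dependent parameter enters the two-mutation hazard, using the rare-conversion approximation given under entry 6 with the first-mutation rate elevated during an assumed exposure window. All parameter values and the window are illustrative, not fitted to the radon data.

```python
import numpy as np

X, mu = 1e7, 1e-7            # normal cells; second-mutation rate (per cell-week)
alpha, beta = 0.11, 0.10     # intermediate-cell division / death rates (per week)
nu0, nu_exp = 1e-8, 8e-8     # first-mutation rate: background vs. during exposure

def nu(s):
    """Time-dependent first-mutation rate, elevated during an assumed
    exposure window (weeks 10-30), e.g. radon exposure."""
    return np.where((s >= 10) & (s < 30), nu_exp, nu0)

def hazard(t, n=2000):
    """Rare-conversion approximation:
    h(t) ~ mu * integral_0^t nu(s) X exp((alpha - beta)(t - s)) ds."""
    s = np.linspace(0.0, t, n + 1)
    f = nu(s) * X * np.exp((alpha - beta) * (t - s))
    ds = s[1] - s[0]
    return mu * float(((f[:-1] + f[1:]) * 0.5 * ds).sum())   # trapezoid rule

for t in (10, 30, 60, 100):
    print(f"t = {t:3d} wk   h(t) = {hazard(t):.3e} per week")
# The hazard keeps rising after exposure ends because clones initiated during
# the window continue to expand (alpha > beta): detailed exposure patterns
# matter, which is what the time-dependent solution makes tractable.
```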

20.
Productivity differences across firms are large and persistent, but the evidence for worker reallocation as an important source of aggregate productivity growth is mixed. The purpose of this paper is to estimate the structure of an equilibrium model of growth through innovation designed to identify and quantify the role of resource reallocation in the growth process. The model is a version of the Schumpeterian theory of firm evolution and growth developed by Klette and Kortum (2004) extended to allow for firm heterogeneity. The data set is a panel of Danish firms that includes information on value added, employment, and wages. The model's fit is good. The estimated model implies that more productive firms in each cohort grow faster and consequently crowd out less productive firms in steady state. This selection effect accounts for 53% of aggregate growth in the estimated version of the model.
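A minimal simulation of Klette-Kortum-style firm dynamics with heterogeneity: a firm holding n product lines innovates (gains a line) at rate lam*n, where lam differs across firm types, and loses each line at the economy-wide creative-destruction rate mu; firms exit at n = 0. The rates and horizon are illustrative, not the paper's estimates.

```python
import random

random.seed(42)

def simulate_firm(lam, mu=0.10, n0=2, t_max=50.0):
    """Birth-death process on a firm's number of product lines:
    n -> n+1 at rate lam*n (internal innovation),
    n -> n-1 at rate mu*n  (creative destruction); n = 0 means exit."""
    t, n = 0.0, n0
    while t < t_max and n > 0:
        t += random.expovariate((lam + mu) * n)
        n += 1 if random.random() < lam / (lam + mu) else -1
    return n

for lam, label in [(0.08, "less productive"), (0.12, "more productive")]:
    finals = [simulate_firm(lam) for _ in range(5_000)]
    exits = sum(f == 0 for f in finals) / len(finals)
    mean_n = sum(finals) / len(finals)
    print(f"{label:16s} lam={lam:.2f}  exit share={exits:.2f}  mean size={mean_n:.1f}")
# More innovative firms grow and survive while less innovative firms shrink
# and exit; this is the selection channel to which the estimated model
# attributes 53% of aggregate growth.
```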

