Similar Articles
20 similar articles were found.
1.
Sensitivity analysis for unmeasured confounding should be reported more often, especially in observational studies. In the standard Cox proportional hazards model, this requires substantial assumptions and can be computationally difficult. The marginal structural Cox proportional hazards model (Cox proportional hazards MSM) with inverse probability weighting has several advantages over the standard Cox model, including in situations with only one assessment of exposure (point exposure) and time-independent confounders. We describe how simple computations provide a sensitivity analysis for unmeasured confounding in a Cox proportional hazards MSM with point exposure. This is achieved by translating the general framework for sensitivity analysis for MSMs by Robins and colleagues to survival-time data. Instead of bias-correcting the observations, we correct the hazard rate to adjust for a specified amount of unmeasured confounding. As an additional benefit, the Cox proportional hazards MSM is robust against bias from differential loss to follow-up. As an illustration, the Cox proportional hazards MSM was applied in a reanalysis of the association between smoking and depression in a population-based cohort of Norwegian adults. The association was moderately sensitive to unmeasured confounding.
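As a rough illustration of the weighting step only (not of the hazard-rate correction for unmeasured confounding described in the abstract), the sketch below builds stabilized inverse-probability weights for a point exposure and fits a weighted Cox model. The column names (smoke, age, sex, time, depressed) and the use of scikit-learn and lifelines are assumptions made for the example, not part of the original analysis.

```python
# Minimal sketch: stabilized inverse-probability weights for a point
# exposure, then a weighted Cox fit (a marginal structural Cox model).
# Column names are hypothetical; covariates are assumed numeric.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

def fit_cox_msm(df: pd.DataFrame) -> CoxPHFitter:
    X = df[["age", "sex"]]                          # measured baseline confounders
    ps_model = LogisticRegression(max_iter=1000).fit(X, df["smoke"])
    ps = ps_model.predict_proba(X)[:, 1]            # P(A=1 | L)
    p_marg = df["smoke"].mean()                     # P(A=1), for stabilization

    # Stabilized weights: sw = P(A=a) / P(A=a | L)
    df = df.copy()
    df["sw"] = df["smoke"] * p_marg / ps + (1 - df["smoke"]) * (1 - p_marg) / (1 - ps)

    cph = CoxPHFitter()
    cph.fit(df[["time", "depressed", "smoke", "sw"]],
            duration_col="time", event_col="depressed",
            weights_col="sw", robust=True)          # robust (sandwich) SEs for weighting
    return cph
```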

2.
We present a novel real-time univariate monitoring scheme for detecting a sustained departure of a process mean from a given standard, assuming a constant variance. Our proposed stopping rule is based on the total variation of a nonparametric taut string estimator of the process mean and is designed to provide a desired average run length for an in-control situation. Compared to the more prominent CUSUM fast initial response (FIR) methodology, and allowing for a restart following a false alarm, the proposed two-sided taut string (TS) scheme produces a significant reduction in average run length for a wide range of changes in the mean that occur at or immediately after the start of process monitoring. A decision rule for choosing between the proposed TS chart and the CUSUM FIR chart, taking into account both the false alarm rate and the average run length to detect a shift in the mean, is proposed and implemented. Supplementary materials are available online.
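For reference, a minimal sketch of the comparison baseline named above, a two-sided CUSUM chart with a fast initial response (FIR) head start, is given below. The taut string scheme itself is not reproduced, and the reference value k, decision limit h, and head-start value are illustrative choices.

```python
# Two-sided CUSUM with FIR head start: both one-sided statistics start
# at a positive value (here h/2) so early shifts are detected faster.
import numpy as np

def cusum_fir_run_length(x, k=0.5, h=4.0, headstart=2.0):
    """Return the first index at which the chart signals, or None."""
    c_plus, c_minus = headstart, headstart
    for t, xt in enumerate(x, start=1):
        c_plus = max(0.0, c_plus + xt - k)
        c_minus = max(0.0, c_minus - xt - k)
        if c_plus > h or c_minus > h:
            return t
    return None

rng = np.random.default_rng(0)
x = rng.normal(loc=0.5, scale=1.0, size=500)   # mean shift present from start-up
print(cusum_fir_run_length(x))
```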

3.
Indices of population ‘health need’ are often used to distribute health resources or assess equity in service provision. This article describes a spatial structural equation model incorporating multiple indicators of need and multiple population health risks that affect need (analogous to multiple indicators–multiple causes models). More specifically, the multiple indicator component of the model involves health outcomes such as hospital admissions or mortality, whereas the multiple risk component models the impact on need of area-level social and demographic indicators, which proxy population-level risk factors for different diseases. The latent need construct is allowed (under a Bayesian approach) to be spatially correlated, although the prior assumed for need allows a mix of spatially structured and unstructured influences. A case study considers variations in need for coronary heart disease (CHD) care over 625 small areas in London, using recent mortality and hospitalization data (the ‘indicators’) and measures of general ill-health, income and unemployment, which proxy variations in population risk for CHD.

4.
We evaluate the effects of college choice on earnings using Swedish register databases. This case study is used to motivate the introduction of a novel procedure to analyse the sensitivity of such an observational study to the assumption made that there are no unobserved confounders – variables affecting both college choice and earnings. This assumption is not testable without further information, and should be considered an approximation of reality. To perform a sensitivity analysis, we measure the departure from the unconfoundedness assumption with the correlation between college choice and earnings when conditioning on observed covariates. The use of a correlation as a measure of dependence allows us to propose a standardised procedure by advocating the use of a fixed value for the correlation, typically 1% or 5%, when checking the sensitivity of an evaluation study. A correlation coefficient is, moreover, intuitive to most empirical scientists, which makes the results of our sensitivity analysis easier to communicate than those of previously proposed methods. In our evaluation of the effects of college choice on earnings, the significantly positive effect obtained could not be questioned by a sensitivity analysis allowing for unobserved confounders inducing at most 5% correlation between college choice and earnings.
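The toy simulation below is not the authors' procedure; it only illustrates the idea that a small residual dependence induced by an unobserved confounder U can move a naive regression estimate of the treatment effect. All data-generating values are invented, and the parameter rho is monotonically related to, but not calibrated to equal, the conditional correlation discussed in the abstract.

```python
# Naive OLS estimate of a binary "college choice" effect on "earnings"
# when an unobserved confounder U enters both equations through rho.
import numpy as np
import statsmodels.api as sm

def naive_effect(rho, n=20_000, true_effect=0.1, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n)                       # observed covariate
    u = rng.normal(size=n)                       # unobserved confounder
    d = (0.5 * x + rho * u + rng.normal(size=n) > 0).astype(float)
    y = true_effect * d + 0.5 * x + rho * u + rng.normal(size=n)
    X = sm.add_constant(np.column_stack([d, x]))
    return sm.OLS(y, X).fit().params[1]          # coefficient on d

for rho in (0.0, 0.01, 0.05):
    print(rho, round(naive_effect(rho), 3))
```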

5.
This paper is an applied analysis of the causal structure of linear multi-equational econometric models. Its aim is to identify the kinds of relationships linking the endogenous variables of the model, distinguishing between causal links and feedback loops. The investigation is first carried out within a deterministic framework and then moves on to show how the results may change in a more realistic stochastic context. The causal analysis is then applied to a linear simultaneous-equation model explaining fertility rates. The analysis is carried out by means of RATS code designed to reveal the nature of the relationships within the model.
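One generic way to separate causal orderings from feedback loops in such a system, sketched below, is to treat the non-zero coefficients among the endogenous variables as a directed graph and inspect its strongly connected components. This is not the RATS program the paper refers to; the variable names and coefficient matrix are invented.

```python
# Variables in a strongly connected component of size > 1 form a feedback
# loop; the condensation of the graph gives a causal ordering of blocks.
import numpy as np
import networkx as nx

names = ["fertility", "income", "education"]
# B[i, j] != 0 means variable j appears in the equation for variable i
B = np.array([[0.0, 0.3, 0.5],
              [0.0, 0.0, 0.2],
              [0.0, 0.4, 0.0]])

G = nx.DiGraph()
G.add_edges_from((names[j], names[i])
                 for i in range(3) for j in range(3) if B[i, j] != 0)

loops = [c for c in nx.strongly_connected_components(G) if len(c) > 1]
print("feedback loops:", loops)                  # here: {'income', 'education'}

cond = nx.condensation(G)                        # DAG of strongly connected blocks
order = [cond.nodes[b]["members"] for b in nx.topological_sort(cond)]
print("causal ordering of blocks:", order)
```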

6.
The primary goal of multivariate statistical process performance monitoring is to identify deviations from normal operation within a manufacturing process. The basis of the monitoring schemes is historical data that have been collected when the process is running under normal operating conditions. These data are then used to establish confidence bounds to detect the onset of process deviations. In contrast with traditional approaches based on the Gaussian assumption, this paper proposes the application of the infinite Gaussian mixture model (GMM) for the calculation of the confidence bounds, thereby relaxing this restrictive assumption. The infinite GMM is a special case of Dirichlet process mixtures and is introduced as the limit of the finite GMM, i.e. as the number of mixture components tends to ∞. On the basis of the estimation of the probability density function via the infinite GMM, the confidence bounds are calculated by using the bootstrap algorithm. The proposed methodology is demonstrated through its application to a simulated continuous chemical process and a batch semiconductor manufacturing process.
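A minimal sketch of the general idea, under stated assumptions: scikit-learn's Dirichlet-process BayesianGaussianMixture is used as a truncated stand-in for the infinite GMM, and a bootstrap over resampled in-control data gives a bound on the negative log-density. This illustrates the density-plus-bootstrap recipe only; it is not the paper's exact estimator or bound construction.

```python
# Truncated Dirichlet-process GMM density estimate plus a bootstrap
# percentile bound on the monitoring statistic (-log density).
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 400), rng.normal(4, 0.5, 200)]).reshape(-1, 1)

dpgmm = BayesianGaussianMixture(
    n_components=20,                                  # truncation level
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500, random_state=0).fit(x)

stats = []
for _ in range(200):                                  # bootstrap resamples of in-control data
    xb = x[rng.integers(0, len(x), size=len(x))]
    stats.append(np.percentile(-dpgmm.score_samples(xb), 99))
bound = np.mean(stats)
print("99% control bound on -log density:", round(bound, 3))
```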

7.
A bivariate generalized linear model is developed as a mixture distribution with one component of the mixture being discrete, with probability mass only at the origin. The use of the proposed model is illustrated by analyzing local-area meteorological measurements with a constant correlation structure that incorporates predictor variables. A Monte Carlo study is performed to evaluate the inferential efficiency of the model parameters for two types of true models. The results suggest that the estimates of the regression parameters are consistent and that the efficiency of the inference increases for the proposed model when ρ≥0.50, especially in larger samples. As an illustration of the bivariate generalized linear model, we analyze precipitation monitoring data from adjacent local stations in Tokyo and Yokohama.
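Purely as an illustration of such a mixture with probability mass at the origin (the distributional details here are invented, not the paper's model), one can simulate a bivariate outcome in which a discrete component places mass at (0, 0), e.g. no precipitation at either station, and the remaining mass is a correlated continuous component:

```python
# Bivariate mixture: point mass at the origin with probability p_zero,
# otherwise a correlated positive (log-normal) pair.
import numpy as np

def simulate_bivariate_mixture(n=1000, p_zero=0.4, rho=0.6, seed=0):
    rng = np.random.default_rng(seed)
    at_origin = rng.random(n) < p_zero               # discrete mass at (0, 0)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    positive = np.exp(rng.multivariate_normal([0.0, 0.0], cov, size=n))
    return np.where(at_origin[:, None], 0.0, positive)

y = simulate_bivariate_mixture()
print((y == 0).all(axis=1).mean())                   # share of "dry" pairs, about 0.4
```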

8.
Statistical analysis consists of two phases: induction, for model parameter estimation, and deduction, to make decisions on the basis of the statistical model. In the Bayesian context, predictive analysis is the key concept for performing the deductive phase, and Monte Carlo posterior simulations are shown to be extremely valuable tools for tasks such as model selection and model checking. An example of predictive analysis by simulation is detailed for the linear model with autocorrelated errors, estimated beforehand by Gibbs sampling. Numerical illustrations are then given for a food process with data collected on line. Special attention is paid to the control of its anticipated behaviour under uncertainty within Bayesian decision theory.
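A small illustrative sketch of the predictive step only (the Gibbs sampler producing the posterior draws is assumed and not shown): given draws of the regression coefficients, the autocorrelation parameter and the innovation scale for a linear model with AR(1) errors, replicated data sets are simulated and a checking statistic is compared with its observed value, a basic posterior predictive check.

```python
# Posterior predictive check for a linear model with AR(1) errors,
# driven by externally supplied posterior draws.
import numpy as np

def posterior_predictive_check(X, y, beta_draws, rho_draws, sigma_draws, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    t_obs = y.max()                                   # checking statistic (example choice)
    t_rep = []
    for beta, rho, sigma in zip(beta_draws, rho_draws, sigma_draws):
        e = np.empty(n)
        e[0] = rng.normal(0, sigma / np.sqrt(1 - rho**2))   # stationary start
        for t in range(1, n):
            e[t] = rho * e[t - 1] + rng.normal(0, sigma)
        t_rep.append((X @ beta + e).max())
    # posterior predictive p-value for the chosen statistic
    return np.mean(np.array(t_rep) >= t_obs)
```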

9.
The use of asymptotic moments to increase the precision of the control variate technique for Monte Carlo estimation is discussed. An application is made to the estimation of the mean and variance of the likelihood ratio goodness-of-fit statistic with the Pearson statistic used as a control variate. Estimates of the variance reductions are given.
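A minimal numeric sketch of the control-variate idea described above: the mean of the likelihood-ratio statistic G2 is estimated by Monte Carlo, with the Pearson statistic X2, whose asymptotic mean equals the degrees of freedom, used as the control variate. The cell probabilities and sample size are arbitrary.

```python
# Control variate estimator: mean(G2) - b * (mean(X2) - E[X2]),
# where E[X2] is taken as the asymptotic value (degrees of freedom).
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = np.array([0.2, 0.3, 0.5]), 50, 5000
df = len(p) - 1                                      # asymptotic mean of X2

g2s, x2s = [], []
for _ in range(reps):
    obs = rng.multinomial(n, p)
    exp = n * p
    mask = obs > 0                                   # 0 * log(0) treated as 0
    g2s.append(2 * np.sum(obs[mask] * np.log(obs[mask] / exp[mask])))
    x2s.append(np.sum((obs - exp) ** 2 / exp))
g2s, x2s = np.array(g2s), np.array(x2s)

b = np.cov(g2s, x2s)[0, 1] / np.var(x2s)             # estimated optimal coefficient
cv_estimate = g2s.mean() - b * (x2s.mean() - df)
print("crude:", g2s.mean(), "control variate:", cv_estimate)
```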

10.
The Finnish common toad data of Heikkinen and Hogmander are reanalysed using an alternative fully Bayesian model that does not require a pseudolikelihood approximation and an alternative prior distribution for the true presence or absence status of toads in each 10 km×10 km square. Markov chain Monte Carlo methods are used to obtain posterior probability estimates of the square-specific presences of the common toad and these are presented as a map. The results are different from those of Heikkinen and Hogmander and we offer an explanation in terms of the prior used for square-specific presence of the toads. We suggest that our approach is more faithful to the data and avoids unnecessary confounding of effects. We demonstrate how to extend our model efficiently with square-specific covariates and illustrate this by introducing deterministic spatial changes.

11.
Mergers and acquisitions are an important corporate strategy. We collect recent merger and acquisition data for companies on the China A-share stock market to explore the relationship between corporate ownership structure and the speed of merger success. When studying merger success, selection bias occurs if only completed mergers are analyzed; there is also a censoring problem when duration time is used to measure the speed. In this article, for time-to-event outcomes, we propose a semiparametric version of the type II Tobit model that can simultaneously handle selection bias and right censoring. The proposed model can also easily incorporate time-dependent covariates. A nonparametric maximum likelihood estimator is proposed, and the resulting estimators are shown to be consistent, asymptotically normal, and semiparametrically efficient. Monte Carlo studies are carried out to assess the finite-sample performance of the proposed approach. Using the proposed model, we find that a higher power balance within a company is associated with faster merger success.

12.
We propose a regime-switching autoregressive model and apply it to analyze the daily water discharge series of the River Tisza in Hungary. The dynamics is governed by two regimes, across which both the autoregressive coefficients and the innovation distributions differ; moreover, the hidden regime indicator process is allowed to be non-Markovian. After examining stationarity and basic properties of the model, we turn to its estimation by Markov chain Monte Carlo (MCMC) methods and propose two algorithms. The values of the latent process serve as auxiliary parameters in the first, while the change points of the regimes do so in the second, in a reversible jump MCMC setting. After comparing the mixing performance of the two methods, the model is fitted to the water discharge data. Simulations show that it reproduces the important features of the water discharge series, such as the highly skewed marginal distribution and the asymmetric shape of the hydrograph.
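As a small simulation sketch of this kind of model (all parameter values are invented and the estimation algorithms are not reproduced), the code below generates a two-regime autoregression in which both the AR coefficient and the innovation distribution change with the regime, and regime durations are drawn from a non-geometric distribution so that the regime indicator is non-Markovian.

```python
# Two-regime AR(1) with regime-dependent coefficients and innovation
# distributions, and non-geometric regime durations.
import numpy as np

def simulate_switching_ar(n=2000, seed=0):
    rng = np.random.default_rng(seed)
    phi = (0.6, 0.95)                        # AR coefficients per regime
    scale = (1.0, 3.0)                       # innovation scales (regime 1 is "flood-like")
    x, regimes, s = [0.0], [], 0
    while len(x) < n:
        dur = rng.negative_binomial(5, 0.1) + 1      # non-geometric duration
        for _ in range(dur):
            eps = rng.standard_t(df=4) * scale[s]    # heavier-tailed innovations
            x.append(phi[s] * x[-1] + eps)
            regimes.append(s)
            if len(x) >= n:
                break
        s = 1 - s                            # switch regime
    return np.array(x[1:]), np.array(regimes)

series, regimes = simulate_switching_ar()
print(series[:5], regimes[:5])
```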

13.
We extend the Bayesian Model Averaging (BMA) framework to dynamic panel data models with endogenous regressors using a Limited Information Bayesian Model Averaging (LIBMA) methodology. Monte Carlo simulations confirm the asymptotic performance of our methodology both in BMA and selection, with high posterior inclusion probabilities for all relevant regressors, and parameter estimates very close to their true values. In addition, we illustrate the use of LIBMA by estimating a dynamic gravity model for bilateral trade. Once model uncertainty, dynamics, and endogeneity are accounted for, we find several factors that are robustly correlated with bilateral trade. We also find that applying methodologies that do not account for either dynamics or endogeneity (or both) results in different sets of robust determinants.

14.
Longitudinal clinical trials with long follow-up periods almost invariably suffer from loss to follow-up and non-compliance with the assigned therapy. An example is protocol 128 of the AIDS Clinical Trials Group, a 5-year equivalency trial comparing reduced dose zidovudine with the standard dose for treatment of paediatric acquired immune deficiency syndrome patients. This study compared responses to treatment by using both clinical and cognitive outcomes. The cognitive outcomes are of particular interest because the effects of human immunodeficiency virus infection of the central nervous system can be more acute in children than in adults. We formulate and apply a Bayesian hierarchical model to estimate both the intent-to-treat effect and the average causal effect of reducing the prescribed dose of zidovudine by 50%. The intent-to-treat effect quantifies the causal effect of assigning the lower dose, whereas the average causal effect represents the causal effect of actually taking the lower dose. We adopt a potential outcomes framework where, for each individual, we assume the existence of a different potential outcomes process at each level of time spent on treatment. The joint distribution of the potential outcomes and the time spent on assigned treatment is formulated using a hierarchical model: the potential outcomes distribution is given at the first level, and dependence between the outcomes and time on treatment is specified at the second level by linking the time on treatment to subject-specific effects that characterize the potential outcomes processes. Several distributional and structural assumptions are used to identify the model from observed data, and these are described in detail. A detailed analysis of AIDS Clinical Trials Group protocol 128 is given; inference about both the intent-to-treat effect and the average causal effect indicates a high probability of dose equivalence with respect to cognitive functioning.

15.
The problem of simultaneously estimating p normal variances is investigated when the parameters are believed a priori to be similar in size. A hierarchical Bayes approach is employed, and the resulting estimator is compared to commonly used estimators, including one proposed by Box and Tiao (1973) using a Bayesian approach with a noninformative prior. The technique is then applied to estimate components of variance in the one-way random-effects analysis-of-variance model.
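A hedged sketch of one possible hierarchical formulation, not necessarily the paper's: each variance gets an inverse-gamma prior sigma_i^2 ~ InvGamma(a, b) with a gamma hyperprior on b, which shrinks the p estimates towards each other, and a two-block Gibbs sampler exploits the conjugacy. SS and n are the within-group sums of squares and sample sizes; all hyperparameter values are illustrative.

```python
# Gibbs sampler for p variances under sigma_i^2 ~ InvGamma(a, b),
# b ~ Gamma(a0, rate=b0). Conjugate full conditionals:
#   sigma_i^2 | . ~ InvGamma(a + n_i/2, b + SS_i/2)
#   b         | . ~ Gamma(a0 + p*a, rate = b0 + sum_i 1/sigma_i^2)
import numpy as np

def gibbs_variances(SS, n, a=2.0, a0=1.0, b0=1.0, iters=5000, seed=0):
    rng = np.random.default_rng(seed)
    p = len(SS)
    b = 1.0
    draws = []
    for _ in range(iters):
        sigma2 = 1.0 / rng.gamma(a + n / 2.0, 1.0 / (b + SS / 2.0))
        b = rng.gamma(a0 + p * a, 1.0 / (b0 + np.sum(1.0 / sigma2)))
        draws.append(sigma2)
    return np.mean(draws[iters // 2:], axis=0)       # posterior means after burn-in

SS = np.array([12.0, 30.0, 9.0, 50.0])               # illustrative sums of squares
n = np.array([10, 10, 10, 10])
print(gibbs_variances(SS, n))
```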

16.
In this paper, we study the identifiability of a latent random effect model for mixed correlated continuous and ordinal longitudinal responses. We derive conditions for the identifiability of the covariance parameters of the responses. We also propose a sensitivity analysis to investigate perturbations arising from non-identifiability of the covariance parameters, and show how certain elements of the covariance structure can be used; these elements are associated with conditions for identifiability of the covariance parameters of the responses. The influence of small perturbations of these elements on the maximal normal curvature is also studied. The model is illustrated using medical data.

17.
Hedonic price models are commonly used in the study of markets for various goods, most notably those for wine, art, and jewelry. These models were developed to estimate implicit prices of product attributes within a given product class, where for some goods, such as wine, substantial product differentiation exists. To address this issue, recent research on wine prices employs local polynomial regression clustering (LPRC) for estimating regression models under class uncertainty. This study demonstrates that a superior empirical approach, estimation of a mixture model, is applicable to a hedonic model of wine prices, provided only that the dependent variable in the model is rescaled. The present study also catalogues several advantages of estimating mixture models over LPRC modeling.
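For concreteness, a generic EM sketch for a two-component mixture of linear regressions is given below as one way of "estimating a mixture model" for hedonic prices; it is not the study's specification. The design matrix X is assumed to include an intercept column, and in the spirit of the abstract the dependent variable y passed in would be a rescaled (for example standardized log) price.

```python
# EM for a k-component mixture of Gaussian linear regressions:
# E-step computes responsibilities, M-step does weighted least squares.
import numpy as np

def em_mixture_regression(X, y, k=2, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = rng.normal(size=(k, d))
    sigma2 = np.full(k, y.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: posterior component probabilities (log-space for stability)
        resid = y[:, None] - X @ beta.T                       # n x k residuals
        logp = (np.log(pi) - 0.5 * np.log(2 * np.pi * sigma2)
                - 0.5 * resid**2 / sigma2)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted least squares and variance update per component
        for j in range(k):
            w = r[:, j]
            Xw = X * w[:, None]
            beta[j] = np.linalg.solve(Xw.T @ X, Xw.T @ y)
            sigma2[j] = np.sum(w * (y - X @ beta[j])**2) / w.sum()
        pi = r.mean(axis=0)
    return beta, sigma2, pi
```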

18.
Modelling of the relationship between concentration (PK) and response (PD) plays an important role in drug development. The modelling becomes complicated when the drug concentration and response measurements are not taken simultaneously and/or hysteresis occurs between the response and the concentration. A model-based approach fits a joint pharmacokinetic (PK) and concentration–response (PK/PD) model, including an effect compartment if necessary, to concentration and response data. However, this approach relies on the PK data being well described by a common PK model. We propose an algorithm for a semi-parametric approach to fitting nonlinear mixed PK/PD models including an effect compartment using linear interpolation and extrapolation for concentration data. This approach is independent of the PK model, and the algorithm can easily be implemented using SAS PROC NLMIXED. Practical issues in programming and computing are also discussed. The properties of this approach are examined using simulations. This approach is used to analyse data from a study of the PK/PD relationship between insulin and glucose levels.
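The sketch below illustrates only the interpolation idea (the paper's approach fits a nonlinear mixed model via SAS PROC NLMIXED, which is not reproduced here): concentrations sampled at times different from the responses are linearly interpolated to the response times, and a simple Emax concentration-response curve is then fitted. All data values and parameter names are invented.

```python
# Interpolate concentrations to the response times, then fit an Emax model.
import numpy as np
from scipy.optimize import curve_fit

t_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])    # PK sampling times (h)
conc   = np.array([0.0, 8.0, 12.0, 9.0, 5.0, 1.5])   # concentration values
t_resp = np.array([0.25, 0.75, 1.5, 3.0, 6.0])       # PD sampling times (h)
resp   = np.array([7.9, 10.2, 10.3, 9.4, 7.2])       # response values

# Linear interpolation of concentration at the response times
# (np.interp holds the endpoint values constant outside the sampled range).
c_at_resp = np.interp(t_resp, t_conc, conc)

def emax(c, e0, emax_, ec50):
    return e0 + emax_ * c / (ec50 + c)

params, _ = curve_fit(emax, c_at_resp, resp, p0=[1.0, 12.0, 3.0])
print("E0, Emax, EC50:", np.round(params, 2))
```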

19.
In this paper we propose a new lifetime model for multivariate survival data in the presence of surviving fractions and examine some of its properties. Its genesis is based on situations in which there are m types of unobservable competing causes, where each cause is related to a time of occurrence of an event of interest. Our model is a multivariate extension of the univariate survival cure rate model proposed by Rodrigues et al. [J. Rodrigues, V.G. Cancho, M. de Castro, and F. Louzada-Neto, On the unification of long-term survival models, Statist. Probab. Lett. 79 (2009), pp. 753–759, doi: 10.1016/j.spl.2008.10.029]. The inferential approach exploits maximum likelihood tools. We perform a simulation study in order to verify the asymptotic properties of the maximum likelihood estimators; the simulation study also focuses on the size and power of the likelihood ratio test. The methodology is illustrated on a real customer churn data set.

20.
A new solution is proposed for a sparse data problem arising in nonparametric estimation of a bivariate survival function. Prior information, if available, can be used to obtain initial values for the EM algorithm. Initial values will completely determine estimates of portions of the distribution which are not identifiable from the data, while having a minimal effect on estimates of portions of the distribution for which the data provide sufficient information. Methods are applied to the distribution of women's age at first marriage and age at birth of first child, using data from the Current Population Surveys of 1975 and 1986.
