Sorted results: 72 matches in total.
21.
Censored median regression has proved useful for analyzing survival data in complicated situations, say, when the variance is heteroscedastic or the data contain outliers. In this paper, we study sparse estimation for censored median regression models, an important problem in high dimensional survival data analysis. In particular, a new procedure is proposed that minimizes an inverse-censoring-probability weighted least absolute deviation loss subject to the adaptive LASSO penalty, resulting in a sparse and robust median estimator. We show that, with a proper choice of the tuning parameter, the procedure identifies the underlying sparse model consistently and has the desired large-sample properties, including root-n consistency and asymptotic normality. The procedure also enjoys great advantages in computation, since its entire solution path can be obtained efficiently. Furthermore, we propose a resampling method to estimate the variance of the estimator. The performance of the procedure is illustrated by extensive simulations and two real data applications, including a microarray gene expression survival dataset.
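The objective described in this abstract can be written down directly. Below is a minimal sketch of the inverse-censoring-probability weighted LAD loss with an adaptive LASSO penalty; the argument names (`G_hat`, `w`, `lam`) are illustrative, not the paper's code, and the pilot-fit weights and tuning-parameter choice are assumed to come from elsewhere.

```python
import numpy as np

def ipcw_lad_loss(beta, X, y, delta, G_hat, w, lam):
    """IPCW-weighted least absolute deviation loss plus an adaptive
    LASSO penalty (illustrative sketch of the objective only).

    beta  : candidate regression coefficients
    X, y  : design matrix and observed (possibly censored) responses
    delta : event indicator (1 = uncensored, 0 = censored)
    G_hat : estimated censoring survival probabilities G(y_i)
    w     : adaptive weights, e.g. 1/|beta_pilot| from a pilot fit
    lam   : tuning parameter controlling sparsity
    """
    resid = y - X @ beta
    # only uncensored observations contribute, each reweighted by 1/G_hat
    loss = np.mean((delta / G_hat) * np.abs(resid))
    return loss + lam * np.sum(w * np.abs(beta))
```

Minimizing this non-smooth objective over `beta` is what yields the sparse median estimator; the paper's solution-path algorithm is not reproduced here.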
22.
Risk factor selection is very important in the insurance industry, as it supports precise rate making and the study of features of high-quality insureds. Zero-inflated data are common in insurance, for example in claim frequency data, and zero-inflation makes the selection of risk factors quite difficult. In this article, we propose a new risk factor selection approach, EM adaptive LASSO, for the zero-inflated Poisson regression model, which combines the EM algorithm with the adaptive LASSO penalty. Under some regularity conditions, we show that, with probability approaching 1, the important factors are selected and the redundant factors are excluded. We investigate the finite sample performance of the proposed method through a simulation study and an analysis of car insurance data from the SAS Enterprise Miner database.
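One ingredient of any EM fit for a zero-inflated Poisson model is the E-step: the posterior probability that an observed zero is a structural zero rather than a Poisson zero. A hedged sketch of that single step (scalar `pi` and `mu` for brevity; the paper's EM adaptive LASSO adds regression structure and the penalty on top):

```python
import numpy as np

def zip_e_step(y, pi, mu):
    """E-step for a zero-inflated Poisson model: posterior probability
    that each observed zero came from the structural-zero component.

    pi : zero-inflation probability, mu : Poisson mean.
    Nonzero counts can only come from the Poisson component.
    """
    y = np.asarray(y, dtype=float)
    # P(structural | Y = 0) = pi / (pi + (1 - pi) * exp(-mu))
    p_struct = pi / (pi + (1.0 - pi) * np.exp(-mu))
    return np.where(y == 0, p_struct, 0.0)
```

In the M-step these responsibilities weight the two component likelihoods, which is where the adaptive LASSO penalty enters in the proposed approach.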
23.
In the past decades, the number of variables explaining observations in practical applications has increased steadily. This has led to heavy computational tasks, despite the wide use of preliminary variable selection methods in data processing. Consequently, more methodological techniques have appeared that reduce the number of explanatory variables without losing much information. Among these techniques, two distinct approaches are apparent: 'shrinkage regression' and 'sufficient dimension reduction'. Surprisingly, there has been little communication or comparison between these two methodological categories, and it is not clear when each approach is appropriate. In this paper, we fill some of this gap by first reviewing each category briefly, paying special attention to its most commonly used methods. We then compare commonly used methods from both categories based on their accuracy, computation time, and ability to select effective variables, and conduct a simulation study of the performance of the methods in each category. The selected methods are also tested on two sets of real data, which allows us to recommend conditions under which each approach is more appropriate for high-dimensional data.
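The shrinkage-regression family this review compares is built on one primitive: the soft-thresholding operator, which is what makes LASSO-type methods both shrink and zero out coefficients. A minimal sketch:

```python
import numpy as np

def soft_threshold(z, lam):
    """Soft-thresholding operator S(z, lam) = sign(z) * max(|z| - lam, 0).
    Coefficients are pulled toward zero and set exactly to zero whenever
    |z| <= lam -- the mechanism behind LASSO-type variable selection."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)
```

Sufficient dimension reduction, by contrast, replaces the predictors with a few linear combinations rather than zeroing individual coefficients, which is why the two families select variables so differently.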
24.
In this paper, we develop an efficient wavelet-based regularized linear quantile regression framework for coefficient estimation, where the responses are scalars and the predictors include both scalars and functions. The framework consists of two important parts: wavelet transformation and regularized linear quantile regression. The wavelet transform approximates functional data by representing it with finitely many wavelet coefficients, effectively capturing its local features. Quantile regression is robust to response outliers and heavy-tailed errors and, compared with other methods, provides a more complete picture of how responses change conditional on covariates. Meanwhile, regularization removes small wavelet coefficients to achieve sparsity and efficiency. An algorithm based on the Alternating Direction Method of Multipliers (ADMM) is derived to solve the optimization problems. We conduct numerical studies to investigate the finite sample performance of our method and apply it to real data from ADHD studies.
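To show the splitting structure ADMM exploits, here is a generic ADMM solver for the simpler LASSO problem (squared-error loss); the paper applies the same alternation to the quantile check loss instead. The parameter names and defaults are illustrative, not the paper's.

```python
import numpy as np

def admm_lasso(X, y, lam, rho=1.0, n_iter=200):
    """Generic ADMM for  min_b 0.5*||y - X b||^2 + lam*||b||_1,
    shown as a template for the splitting used in the framework:
    a smooth subproblem, a soft-threshold step, and a dual update.
    """
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    x = np.zeros(p)
    z = np.zeros(p)
    u = np.zeros(p)
    A = XtX + rho * np.eye(p)                 # x-update system matrix
    for _ in range(n_iter):
        x = np.linalg.solve(A, Xty + rho * (z - u))          # smooth part
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # soft-threshold
        u = u + x - z                                        # dual ascent
    return z
```

With an orthonormal design the solution reduces to soft-thresholding the least-squares fit, which gives a quick sanity check on the iteration.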
25.
Variable selection in finite mixture of regression (FMR) models is frequently used in statistical modeling. Most applications of variable selection in FMR models assume a normal distribution for the regression error. Such an assumption is unsuitable for data containing a group or groups of observations with asymmetric behavior. In this paper, we introduce a variable selection procedure for FMR models using the skew-normal distribution. With an appropriate choice of the tuning parameters, we establish the theoretical properties of our procedure, including consistency in variable selection and the oracle property in estimation. To estimate the parameters of the model, we develop a modified EM algorithm for the numerical computations. The methodology is illustrated through numerical experiments and a real data example.
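The E-step of any EM fit for an FMR model computes, for each observation, the posterior probability of belonging to each component. A hedged sketch using symmetric normal components for brevity; the paper replaces the normal density with the skew-normal one, which adds a shape parameter per component.

```python
import numpy as np

def fmr_responsibilities(y, pis, mus, sigmas):
    """E-step responsibilities for a finite mixture: rows index
    observations, columns index components.  Normal densities are used
    here purely for illustration (the paper uses skew-normal ones)."""
    y = np.asarray(y, dtype=float)[:, None]
    dens = np.exp(-0.5 * ((y - mus) / sigmas) ** 2) \
           / (sigmas * np.sqrt(2.0 * np.pi))
    num = pis * dens
    return num / num.sum(axis=1, keepdims=True)
```

These responsibilities weight each component's penalized regression fit in the M-step, which is where the selection penalty acts.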
Under generalized linear model assumptions, and adopting Lin's medical cost model, the LASSO and SCAD methods are applied to select the factors influencing medical costs, and the effectiveness of the two methods is compared. This identifies the important factors affecting medical insurance claim payments and resolves a series of problems caused by high-dimensional covariates. In the empirical analysis, because the two methods emphasize different statistical properties, the explanatory variables they select differ slightly; however, both sets of results prove readily interpretable and reflect the important information driving medical insurance claim payments.
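The two penalties being compared differ in shape: LASSO grows linearly forever, while SCAD flattens out so that large coefficients are shrunk less. A sketch of the SCAD penalty function with the conventional choice a = 3.7 (the study's implementation details are not reproduced here):

```python
import numpy as np

def scad_penalty(t, lam, a=3.7):
    """SCAD penalty evaluated at |t|: linear (like LASSO) for |t| <= lam,
    quadratic transition for lam < |t| <= a*lam, then constant, so very
    large coefficients incur no extra shrinkage."""
    t = np.abs(np.asarray(t, dtype=float))
    small = t <= lam
    mid = (t > lam) & (t <= a * lam)
    return np.where(small, lam * t,
           np.where(mid, (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1)),
                    lam**2 * (a + 1) / 2.0))
```

This difference in tail behavior is the statistical property the two methods trade off: SCAD reduces bias on large effects, while LASSO's convexity keeps computation simpler.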
29.
This article considers the shrinkage estimation procedure in the Cox proportional hazards regression model when it is suspected that some of the parameters may be restricted to a subspace. We derive the statistical properties of the shrinkage estimators, including their asymptotic distributional biases and risks. The shrinkage estimators have much higher relative efficiency than the classical estimator. Furthermore, we consider two penalty estimators, the LASSO and adaptive LASSO, and compare their performance with that of the shrinkage estimators numerically. A Monte Carlo simulation experiment is conducted for different combinations of irrelevant predictors, and the performance of each estimator is evaluated in terms of simulated mean squared error. The simulation study shows that the shrinkage estimators are comparable to the penalty estimators when the number of irrelevant predictors in the model is relatively large. The shrinkage and penalty methods are applied to two real data sets to illustrate the usefulness of the procedures in practice.
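A generic Stein-type shrinkage estimator combines the full-model fit with the restricted (subspace) fit, pulled together by the statistic testing the restriction. This sketch shows the standard form of that combination; the function and argument names are hypothetical, and the article's Cox-model specifics (and its positive-part variant) are not reproduced.

```python
import numpy as np

def stein_shrinkage(beta_full, beta_restricted, test_stat, q):
    """Stein-type shrinkage toward a restricted estimator:
        beta_R + (1 - (q - 2)/T) * (beta_F - beta_R),
    where T tests the restriction and q (>= 3) is the number of
    restrictions.  Large T (restriction implausible) keeps the full
    estimate; small T pulls hard toward the subspace."""
    c = q - 2.0
    return beta_restricted + (1.0 - c / test_stat) * (beta_full - beta_restricted)
```

When T < q - 2 the factor goes negative, which is why a positive-part version is usually preferred in practice.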
30.
This article investigates the relevance of considering a large number of macroeconomic indicators when forecasting the complete distribution of a variable. The baseline time series model is a semiparametric specification based on the quantile autoregressive (QAR) model, which assumes that the quantiles depend on the lagged values of the variable. We then augment the time series model with macroeconomic information from a large dataset by including principal components or a subset of variables selected by LASSO. We forecast the distribution of the h-month growth rate for four economic variables from 1975 to 2011 and evaluate forecast accuracy relative to a stochastic volatility model using the quantile score. The results for the output and employment measures indicate that the multivariate models outperform the time series forecasts, particularly at long horizons and in the tails of the distribution, while for the inflation variables the improved performance occurs mostly at the 6-month horizon. We also illustrate the practical relevance of predicting the distribution by considering forecasts at three dates during the last recession.
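The quantile score used here to compare forecasts is the pinball loss averaged over observations. A minimal sketch (argument names are illustrative):

```python
import numpy as np

def quantile_score(y, q_pred, tau):
    """Quantile (pinball) score for a predicted tau-quantile:
    tau*u when the outcome exceeds the prediction (u >= 0),
    (tau-1)*u otherwise; averaged over observations, lower is better."""
    u = np.asarray(y, dtype=float) - q_pred
    return np.mean(np.where(u >= 0, tau * u, (tau - 1.0) * u))
```

Because the score can be computed at any tau, it lets the authors localize forecast gains to particular parts of the distribution, such as the tails during the recession episode.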
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号