81.
This study investigated the value of group career construction counseling in a high school context. The author used purposive sampling to select participants who had sought career counseling, together with a mixed-methods intervention study design. Participants (N = 57) completed the Career Adapt-Abilities Scale–South Africa (CAAS-SA) before the first and after the second intervention. The Career Interest Profile and the Maree Career Matrix were used to facilitate the intervention, and the CAAS-SA was used to test the research hypotheses. The findings revealed that the boys' and the girls' career adaptability had improved meaningfully on all of the CAAS-SA subscales. No gender-based differences were found; for both boys and girls, however, differences were detected between pre- and posttest Control and Confidence subscale scores. The findings demonstrate the value of career construction counseling in group settings. More longitudinal research with diverse participants is needed.
82.
83.
Semiparametric models: a generalized self-consistency approach
Summary. In semiparametric models, the dimension d of the maximum likelihood problem is potentially unlimited. Conventional estimation methods generally behave like O(d^3). A new O(d) estimation procedure is proposed for a large class of semiparametric models. Potentially unlimited dimension is handled in a numerically efficient way through a Nelson–Aalen-like estimator. Discussion of the new method is put in the context of recently developed minorization–maximization algorithms based on surrogate objective functions. The procedure for semiparametric models is used to demonstrate three methods of constructing a surrogate objective function: using the difference of two concave functions, the EM way, and the new quasi-EM (QEM) approach. The QEM approach is based on a generalization of the EM-like construction of the surrogate objective function, so it does not depend on the missing-data representation of the model. Like the EM algorithm, the QEM method has a dual interpretation, a result of merging the idea of surrogate maximization with the ideas of imputation and self-consistency. The new approach is compared with other possible approaches by using simulations and analysis of real data. The proportional odds model is used as an example throughout the paper.
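The minorization–maximization idea behind surrogate objective functions can be illustrated with a standard textbook toy problem (a hypothetical sketch, not the paper's QEM procedure): majorizing the absolute loss by a quadratic at the current iterate turns median computation into repeated weighted means.

```python
def mm_median(xs, iters=200, eps=1e-9):
    """MM sketch: majorize sum |x - m| by a quadratic surrogate at the
    current iterate m; the surrogate's minimizer is a weighted mean."""
    m = sum(xs) / len(xs)                               # start from the mean
    for _ in range(iters):
        w = [1.0 / max(abs(x - m), eps) for x in xs]    # surrogate weights
        m = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
    return m

est = mm_median([1.0, 2.0, 3.0, 10.0, 2.5])
print(round(est, 2))   # converges to the sample median, 2.5
```

Each surrogate touches the objective at the current iterate and lies above it elsewhere, so every update decreases the original objective, the same monotonicity property the EM and QEM constructions share.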
84.
Let X1,…,Xn be i.i.d. observations from a heavy-tailed distribution F, i.e. the common distribution of the excesses over a high threshold un can be approximated by a generalized Pareto distribution Gγ,σn with γ>0. This paper deals with the problem of finding confidence regions for the pair (γ,σn): combining the empirical likelihood methodology with estimating equations (close but not identical to the likelihood equations) introduced by Zhang (2007), asymptotically valid confidence regions for (γ,σn) are obtained and proved to perform better than Wald-type confidence regions (especially those derived from the asymptotic normality of the maximum likelihood estimators). By profiling out the scale parameter, confidence intervals for the tail index are also derived.
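As a point of reference for the heavy-tail setting, the sketch below uses the classical Hill estimator of γ on a simulated Pareto sample (a standard baseline, not Zhang's estimating equations or the empirical likelihood regions of the paper; the sample size and threshold level are illustrative choices):

```python
import math, random

def hill_gamma(xs, k):
    """Hill estimator of the tail index gamma from the k largest order
    statistics: the mean log-excess over the (k+1)-th largest value."""
    ys = sorted(xs, reverse=True)
    logs = [math.log(y) for y in ys[:k + 1]]
    return sum(l - logs[k] for l in logs[:k]) / k

random.seed(0)
gamma_true = 0.5                                   # hypothetical tail index
# inverse-transform sample: U^(-gamma) is Pareto with tail index gamma
xs = [random.random() ** (-gamma_true) for _ in range(5000)]
print(round(hill_gamma(xs, 500), 2))
```

The estimator's sampling error is roughly γ/√k, which is why confidence regions (Wald-type or empirical-likelihood-based) rather than point estimates are the focus of the paper.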
85.
One important type of question in statistical inference is how to interpret data as evidence. The law of likelihood provides a satisfactory answer in interpreting data as evidence for simple hypotheses, but remains silent for composite hypotheses. This article examines how the law of likelihood can be extended to composite hypotheses within the scope of the likelihood principle. From a system of axioms, we conclude that the strength of evidence for composite hypotheses should be represented by an interval between the lower and upper profile likelihoods. This article is intended to reveal the connection between profile likelihoods and the law of likelihood under the likelihood principle rather than to argue in favor of the use of profile likelihoods in addressing general questions of statistical inference. The interpretation of the result is also discussed.
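The upper profile likelihood can be made concrete with a small binomial example (an illustrative sketch, not the paper's axiomatic construction; the data and the composite hypothesis H: p ∈ [0.4, 0.6] are hypothetical):

```python
import math

def binom_loglik(p, k, n):
    """Binomial log-likelihood kernel: k successes in n trials."""
    return k * math.log(p) + (n - k) * math.log(1 - p)

k, n = 7, 20                               # hypothetical data, MLE p = 0.35
grid = [i / 1000 for i in range(1, 1000)]
mle_ll = max(binom_loglik(p, k, n) for p in grid)
# upper profile: sup of the likelihood over the composite hypothesis
profile_ll = max(binom_loglik(p, k, n) for p in grid if 0.4 <= p <= 0.6)
evidence = math.exp(profile_ll - mle_ll)   # profile likelihood ratio for H
print(round(evidence, 2))
```

Because the MLE (0.35) lies outside H, the supremum over H is attained at the boundary p = 0.4, and the ratio is strictly below 1; the lower profile likelihood would replace the sup with an inf over H.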
86.
The case fatality rate is an important indicator of the severity of a disease, and unbiased and accurate estimates of it during an outbreak are important in the study of epidemic diseases, including severe acute respiratory syndrome (SARS). In this paper, estimation methods are developed using a constant cure-death hazard ratio. A semiparametric model is presented, in which the cure-death hazard ratio is a parameter of interest, and a profile likelihood-based technique is proposed for estimating the case fatality rate. An extensive simulation was carried out to investigate the performance of this technique for small and medium sample sizes, using both summary and individual data. The results show that the performance depends on the model validity but is not heavily dependent on the sample size. The method was applied to summary SARS data obtained from Hong Kong and Singapore.
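The link between a cure-death hazard ratio and the case fatality rate can be seen in the simplest special case, constant competing hazards, where the eventual fatality rate is h_d/(h_c + h_d). The simulation below checks this identity (a toy sketch with hypothetical hazard values, not the paper's semiparametric model or profile likelihood technique):

```python
import random

random.seed(1)
h_c, h_d = 0.15, 0.05        # hypothetical constant cure and death hazards
deaths = cures = 0
for _ in range(20000):
    t_cure = random.expovariate(h_c)     # latent time to cure
    t_death = random.expovariate(h_d)    # latent time to death
    if t_death < t_cure:
        deaths += 1
    else:
        cures += 1
cfr_hat = deaths / (deaths + cures)      # estimates h_d / (h_c + h_d) = 0.25
print(round(cfr_hat, 2))
```

Under the constant hazard-ratio assumption this fraction is identified even before all outcomes are observed, which is what makes unbiased mid-outbreak estimation possible.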
87.
In this paper, we combine empirical likelihood and estimating functions for censored data to obtain robust confidence regions for the parameters and more generally for functions of the parameters of distributions used in lifetime data analysis. The proposed method works with type I, type II or randomly censored data. It is illustrated by considering inference for log-location-scale models. In particular, we focus on the log-normal and the Weibull models and we tackle the problem of constructing robust confidence regions (or intervals) for the parameters of the model, as well as for quantiles and values of the survival function. The usefulness of the method is demonstrated through a Monte Carlo study and by examples on two lifetime data sets.
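To show the flavor of likelihood-based regions under censoring, the sketch below builds a likelihood-ratio confidence interval for the rate of an exponential lifetime model with type I censoring (a simpler parametric stand-in, assuming hypothetical values for the rate, censoring time, and sample size; the paper's empirical likelihood method and log-location-scale models are more general):

```python
import math, random

random.seed(2)
lam_true, c = 0.5, 3.0           # hypothetical rate and fixed censoring time
obs = [(min(t, c), t <= c)
       for t in (random.expovariate(lam_true) for _ in range(300))]
d = sum(ev for _, ev in obs)     # number of uncensored events
T = sum(t for t, _ in obs)       # total time at risk
mle = d / T                      # censored-exponential MLE

def loglik(l):
    return d * math.log(l) - l * T

cutoff = loglik(mle) - 1.92      # half the chi^2_1 95% quantile (3.84/2)
inside = [l / 1000 for l in range(1, 2000) if loglik(l / 1000) >= cutoff]
print(round(min(inside), 2), round(max(inside), 2))
```

The interval is the set of rates whose log-likelihood lies within 1.92 of the maximum; empirical likelihood regions are built by the same inversion, but without assuming a parametric family.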
88.
The maximum likelihood and maximum partial likelihood approaches to the proportional hazards model are unified. The purpose is to give a general approach to the analysis of the proportional hazards model, whether the baseline distribution is absolutely continuous, discrete, or a mixture. The advantage is that heavily tied data are analyzed with a discrete-time model, while data with no ties are analyzed with ordinary Cox regression. Data sets in between are treated by a compromise between the discrete-time model and Efron's approach to tied data in survival analysis, and the transitions between modes are automatic. A simulation study is conducted comparing the proposed approach with standard methods of handling ties. A recent suggestion that revives Breslow's approach to tied data is also discussed.
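For context, the Breslow treatment of ties, one of the standard methods the unified approach is compared against, keeps a single risk-set denominator for all failures at a tied time. A minimal sketch on a hand-made four-observation data set with a binary covariate (illustrative data, not from the paper):

```python
import math

times = [1, 1, 2, 3]     # two tied failures at t = 1
covs  = [1, 0, 1, 0]     # binary covariate

def breslow_loglik(b):
    """Breslow log partial likelihood: tied failures share one denominator."""
    ll, at_risk = 0.0, set(range(len(times)))
    for t in sorted(set(times)):
        dead = [i for i in at_risk if times[i] == t]
        denom = sum(math.exp(b * covs[i]) for i in at_risk)
        for i in dead:
            ll += b * covs[i] - math.log(denom)
        at_risk -= set(dead)
    return ll

beta_hat = max((b / 100 for b in range(-300, 301)), key=breslow_loglik)
print(round(beta_hat, 2))   # analytic maximizer here is log 2 ≈ 0.69
```

For this data set the log partial likelihood reduces to 2b − 3 log(e^b + 1) − 2 log 2, maximized at b = log 2; Efron's method instead shrinks the denominator as tied failures are processed, and a discrete-time model replaces the denominator with a grouped-data likelihood.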
89.
Kernel-based profile estimation (KBPE) is proposed for partially measured ordinary differential equation (ODE) models. Compared with existing approaches, KBPE uses the structural information contained in the ODEs more efficiently, and no higher-order derivatives need to be estimated from the measurements. Construction of confidence intervals for both parameters and state variables in the finite-sample setting is also discussed. Simulation studies show that KBPE can estimate partially measured ODE models reasonably well when the ordinary two-step approach is not applicable. We also illustrate KBPE on a real data set from a clinical HIV study.
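The ordinary two-step approach that KBPE is compared against first smooths the measurements and then estimates parameters from the fitted trajectory. For the fully measured linear ODE x'(t) = −k·x(t) it collapses to a log-linear regression, sketched below with hypothetical data (this is the baseline method, not KBPE):

```python
import math, random

random.seed(3)
k_true = 0.8                                   # hypothetical decay rate
ts = [i * 0.1 for i in range(50)]              # measurement times
# noisy observations of x(t) = exp(-k t)
xs = [math.exp(-k_true * t) * (1 + random.gauss(0, 0.01)) for t in ts]
logs = [math.log(x) for x in xs]
tbar = sum(ts) / len(ts)
lbar = sum(logs) / len(logs)
slope = sum((t - tbar) * (l - lbar) for t, l in zip(ts, logs)) \
        / sum((t - tbar) ** 2 for t in ts)
k_hat = -slope                                 # step 2: regress log x on t
print(round(k_hat, 2))
```

When some state variables are unmeasured, this recipe breaks down because their trajectories cannot be smoothed from data at all, which is the gap KBPE addresses.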
90.
Abstract

In this article two methods are proposed to make inferences about the parameters of a finite mixture of distributions in the context of partially identifiable censored data. The first method focuses on a mixture of location and scale models and relies on an asymptotic approximation to a suitably constructed augmented likelihood; the second method provides a full Bayesian analysis of the mixture based on a Gibbs sampler. Both methods make explicit use of latent variables and provide computationally efficient procedures compared with other methods which deal directly with the likelihood of the mixture. This may be crucial if the number of components in the mixture is not small. Our proposals are illustrated on a classical example on failure times for communication devices first studied by Mendenhall and Hader (Mendenhall, W. and Hader, R. J. (1958). Estimation of parameters of mixed exponentially distributed failure time distributions from censored life test data. Biometrika 45:504–520). In addition, we study the coverage of the confidence intervals obtained from each of the methods by means of a small simulation exercise.  相似文献
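The latent-variable Gibbs idea can be sketched for an uncensored two-component exponential mixture with conjugate priors (all rates, priors, and sample sizes here are hypothetical illustrations, not the paper's censored location-scale setup): alternate between imputing component labels and drawing parameters from their conditionals.

```python
import math, random

random.seed(4)
# hypothetical two-component exponential mixture data (rates 2.0 and 0.2)
data = [random.expovariate(2.0) for _ in range(200)] + \
       [random.expovariate(0.2) for _ in range(200)]

l1, l2, w = 1.0, 0.5, 0.5          # initial rates and mixing weight
hi_draws = []
for it in range(500):
    # step 1: impute latent component labels given current parameters
    z = []
    for x in data:
        p1 = w * l1 * math.exp(-l1 * x)
        p2 = (1 - w) * l2 * math.exp(-l2 * x)
        z.append(random.random() < p1 / (p1 + p2))
    n1 = sum(z)
    s1 = sum(x for x, zi in zip(data, z) if zi)
    n2, s2 = len(data) - n1, sum(data) - s1
    # step 2: draw parameters from conjugate full conditionals
    l1 = random.gammavariate(1 + n1, 1.0) / (1 + s1)   # Gamma(1,1) prior
    l2 = random.gammavariate(1 + n2, 1.0) / (1 + s2)
    w = random.betavariate(1 + n1, 1 + n2)             # Beta(1,1) prior
    if it >= 250:
        hi_draws.append(max(l1, l2))   # track the larger rate (label-free)
post_mean = sum(hi_draws) / len(hi_draws)
print(round(post_mean, 1))
```

Because each sweep only touches the data once per component assignment, the cost grows linearly in the number of components, which is why the latent-variable formulation stays tractable when the mixture has many components.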