61.
The author considers studies with multiple dependent primary endpoints. Testing hypotheses with multiple primary endpoints may require unmanageably large populations. Composite endpoints consisting of several binary events may be used to reduce a trial to a manageable size. The primary difficulties with composite endpoints are that different endpoints may have different clinical importance and that higher-frequency variables may overwhelm effects of smaller, but equally important, primary outcomes. To compensate for these inconsistencies, we weight each type of event, and the total number of weighted events is counted. To reflect the mutual dependency of primary endpoints and to make the weighting method effective in small clinical trials, we use the Bayesian approach. We assume a multinomial distribution of multiple endpoints with Dirichlet priors and apply the Bayesian test of noninferiority to the calculation of weighting parameters. We use composite endpoints to test hypotheses of superiority in single-arm and two-arm clinical trials. The composite endpoints have a beta distribution. We illustrate this technique with an example. The results provide a statistical procedure for creating composite endpoints. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
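The Dirichlet-multinomial conjugacy underlying the weighting scheme can be sketched as follows. This is a hypothetical illustration, not the authors' procedure: the endpoint categories, prior, event counts, and clinical-importance weights are all invented for the example.

```python
# Dirichlet-multinomial conjugate update for multiple binary endpoints,
# followed by a weighted composite event count (illustrative values only).
def dirichlet_posterior_mean(alpha, counts):
    """Posterior mean of event probabilities: Dirichlet(alpha) prior,
    multinomial counts -> Dirichlet(alpha + counts) posterior."""
    total = sum(alpha) + sum(counts)
    return [(a + n) / total for a, n in zip(alpha, counts)]

# Hypothetical single-arm trial of 100 patients, categories:
# (death, MI, stroke, no event)
alpha  = [1.0, 1.0, 1.0, 1.0]   # uniform Dirichlet prior
counts = [3, 7, 5, 85]
post_mean = dirichlet_posterior_mean(alpha, counts)

# Hypothetical clinical-importance weights (not from the paper);
# the composite counts each event scaled by its weight.
weights = [1.0, 0.6, 0.6, 0.0]
weighted_composite = sum(w * n for w, n in zip(weights, counts))
```

The posterior mean shrinks the raw event frequencies toward the prior, which is what makes the weighting usable in small trials.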
62.
Sequential multi-chart detection procedures for detecting changes in multichannel sensor systems are developed. In the case of complete information on pre-change and post-change distributions, the detection algorithm represents a likelihood ratio-based multichannel generalization of Page’s cumulative sum (CUSUM) test that is applied to general stochastic models that may include correlated and nonstationary observations. There are many potential application areas where it is necessary to consider multichannel generalizations and general statistical models. In this paper our main motivation for doing so is network security: rapid anomaly detection for an early detection of attacks in computer networks that lead to changes in network traffic. Moreover, this kind of application encourages the development of a nonparametric multichannel detection test that does not use exact pre-change (legitimate) and post-change (attack) traffic models. The proposed nonparametric method can be effectively applied to detect a wide variety of attacks such as denial-of-service attacks, worm-based attacks, port-scanning, and man-in-the-middle attacks. In addition, we propose a multichannel CUSUM procedure that is based on binary quantized data; this procedure turns out to be more efficient than the previous two algorithms in certain scenarios. All proposed detection algorithms are based on the change-point detection theory. They utilize the thresholding of test statistics to achieve a fixed rate of false alarms, while allowing changes in statistical models to be detected “as soon as possible”. Theoretical frameworks for the performance analysis of detection procedures, as well as results of Monte Carlo simulations for a Poisson example and results of detecting real flooding attacks, are presented.  相似文献   
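A minimal sketch of the likelihood-ratio multichannel CUSUM for a Poisson example follows. The rates, change point, number of channels, and threshold are illustrative assumptions, not the paper's settings; the per-channel recursion is the standard Page update, with an alarm when the maximum channel statistic crosses the threshold.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler (adequate for small lambda)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= L:
            return k - 1

def multichannel_cusum(streams, lam0, lam1, h):
    """First time max_i W_i(t) >= h, where each W_i follows Page's
    recursion W_i <- max(0, W_i + LLR_i); returns None if no alarm."""
    W = [0.0] * len(streams)
    c = math.log(lam1 / lam0)
    for t, obs in enumerate(zip(*streams)):
        for i, x in enumerate(obs):
            llr = x * c - (lam1 - lam0)   # Poisson log-likelihood ratio
            W[i] = max(0.0, W[i] + llr)
        if max(W) >= h:
            return t
    return None

rng = random.Random(0)
lam0, lam1, nu, T = 5.0, 10.0, 100, 300   # change at time nu in channel 2 only
streams = [[poisson(lam1 if (i == 2 and t >= nu) else lam0, rng)
            for t in range(T)] for i in range(4)]
alarm_time = multichannel_cusum(streams, lam0, lam1, h=12.0)
```

With these rates the post-change log-likelihood ratio has positive drift only in the affected channel, so the alarm fires shortly after the change point while the other channels' statistics stay near zero.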
63.
Previous experimental studies on tax behavior have been particularly concerned with determining the absolute effect of detection rate and punishment on tax filing, leading to mixed results. In this paper, we shed some additional light on the effectiveness of audit probability and sanctions by drawing upon a dynamic setting with particular focus on the time lag between audits. Our results showed that tax compliance decreased immediately after a random audit, suggesting that subjects were prone to misperception of chance. Sanctions decreased compliance to a lesser extent; they were, however, associated with the tendency of subjects to repair their losses by increasing their capital stock.
64.
65.
We propose an evidence synthesis approach through a degradation model to estimate causal influences of physiological factors on myocardial infarction (MI) and coronary heart disease (CHD). For instance, several studies give incidences of MI and CHD for different age strata, while other studies give relative or absolute risks for strata of the main risk factors of MI or CHD. Evidence synthesis of several studies allows incorporating these disparate pieces of information into a single model. To do this we need to develop a sufficiently general dynamical model; we also need to estimate the distribution of explanatory factors in the population. We develop a degradation model for both MI and CHD using a Brownian motion with drift, where the drift is modeled as a function of indicators of obesity, lipid profile, inflammation and blood pressure. Conditionally on these factors, the times to MI or CHD have inverse Gaussian (IG) distributions. The results we want to fit are generally not conditional on all the factors, so we need the marginal distributions of the time of occurrence of MI and CHD; this leads us to manipulate the inverse Gaussian normal (IGN) distribution (an IG whose drift parameter has a normal distribution). Another possible model arises if a factor modifies the threshold, which led us to define an extension of the IGN obtained when both drift and threshold parameters have normal distributions. We applied the model to results published in five important studies of MI and CHD and their risk factors. The fit of the model using the evidence synthesis approach was satisfactory, and the effects of the four risk factors were highly significant.
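The link between the Brownian degradation process and the inverse Gaussian first-passage time can be checked with a small Monte Carlo sketch. The drift, diffusion, and threshold values below are illustrative assumptions, not parameters from the paper.

```python
import math
import random

def first_passage_time(mu, sigma, a, dt, rng, t_max=200.0):
    """Euler simulation of X(t) = mu*t + sigma*B(t); returns the first
    grid time at which X crosses the threshold a."""
    x, t = 0.0, 0.0
    while t < t_max:
        x += mu * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
        if x >= a:
            return t
    return t_max

rng = random.Random(1)
mu, sigma, a = 0.5, 1.0, 5.0   # illustrative drift, diffusion, threshold
times = [first_passage_time(mu, sigma, a, dt=0.01, rng=rng)
         for _ in range(1000)]
mc_mean = sum(times) / len(times)
# Theory: the hitting time is inverse Gaussian with mean a / mu = 10,
# so mc_mean should be close to 10 (up to discretization bias).
```

Making the drift mu a function of covariates, as the abstract describes, then makes the conditional hitting time IG; integrating a normally distributed mu out yields the IGN marginal used for fitting.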
66.
The spectral measure plays a key role in the statistical modeling of multivariate extremes. Estimation of the spectral measure is a complex issue, given the need to obey a certain moment condition. We propose a Euclidean likelihood-based estimator for the spectral measure which is simple and explicitly defined, with its expression being free of Lagrange multipliers. Our estimator is shown to have the same limit distribution as the maximum empirical likelihood estimator of Einmahl and Segers (2009) [Einmahl, J. H. J., Segers, J. (2009). Maximum empirical likelihood estimation of the spectral measure of an extreme-value distribution. Ann. Statist. 37(5B): 2953–2989]. Numerical experiments suggest an overall good performance and identical behavior to the maximum empirical likelihood estimator. We illustrate the method in an extreme temperature data analysis.
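In one dimension the Euclidean likelihood weights have a closed form with no Lagrange multipliers to solve for, which is the flavor of explicitness the abstract refers to. The sketch below is a generic illustration, not the paper's estimator: it reweights hypothetical pseudo-angles in [0, 1] so that their weighted mean equals 1/2, the moment constraint of a bivariate spectral measure.

```python
# Euclidean-likelihood weights under the moment constraint
# sum_i p_i * w_i = mu0, in closed form (1-D case).
def euclidean_likelihood_weights(w, mu0=0.5):
    n = len(w)
    wbar = sum(w) / n
    s2 = sum((x - wbar) ** 2 for x in w) / n   # biased sample variance
    # Unlike empirical likelihood, these weights may be negative.
    return [(1.0 - (wbar - mu0) * (x - wbar) / s2) / n for x in w]

w = [0.1, 0.2, 0.4, 0.5, 0.7, 0.9]   # hypothetical pseudo-angles
p = euclidean_likelihood_weights(w)
# By construction: sum(p) == 1 and sum(p_i * w_i) == 0.5.
```

The closed form follows from minimizing the Euclidean distance between the weights and the uniform weights 1/n subject to the two linear constraints, which is a least-squares problem with an explicit solution.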
67.
The probability laws describing the numbers of oscillations of the continuous symmetric random walk can be expressed in terms of the Stirling numbers of the first kind. The asymptotic behavior can easily be derived using characteristic functions.
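For reference, the unsigned Stirling numbers of the first kind appearing in these laws satisfy the classical recurrence c(n, k) = c(n-1, k-1) + (n-1)·c(n-1, k); the sketch below computes them directly (the recurrence is standard, while their exact role in the oscillation-count laws is as stated in the paper).

```python
def stirling_first_unsigned(n, k):
    """Unsigned Stirling number of the first kind c(n, k): the number of
    permutations of n elements with exactly k cycles."""
    if k == 0:
        return 1 if n == 0 else 0
    if n == 0 or k > n:
        return 0
    return (stirling_first_unsigned(n - 1, k - 1)
            + (n - 1) * stirling_first_unsigned(n - 1, k))
```

A quick sanity check: c(4, 2) = 11, and each row sums to n! since every permutation has some number of cycles.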
68.
69.
It is vital for insurance companies to have appropriate levels of loss reserving to pay outstanding claims and related settlement costs. With many uncertainties and time lags inherently involved in the claims settlement process, loss reserving therefore must be based on estimates. Existing models and methods cannot cope with irregular and extreme claims and hence do not offer an accurate prediction of loss reserving. This paper extends the conventional normal error distribution in loss reserving modeling to a range of heavy-tailed distributions which are expressed by certain scale mixtures forms. This extension enables robust analysis and, in addition, allows an efficient implementation of Bayesian analysis via Markov chain Monte Carlo simulations. Various models for the mean of the sampling distributions, including the log-Analysis of Variance (ANOVA), log-Analysis of Covariance (ANCOVA) and state space models, are considered and the straightforward implementation of scale mixtures distributions is demonstrated using OpenBUGS.
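The scale-mixture idea can be illustrated with the classic case: a normal whose precision is Gamma-mixed is a Student-t, whose heavier tails can accommodate extreme claims. This is a generic sketch of the representation only; the paper's specific mixing distributions and OpenBUGS models are not reproduced here.

```python
import math
import random

def student_t_via_scale_mixture(nu, rng):
    """Draw from Student-t(nu) via the scale-mixture representation:
    x | w ~ N(0, 1/w) with precision w ~ Gamma(shape=nu/2, rate=nu/2)."""
    w = rng.gammavariate(nu / 2.0, 2.0 / nu)   # gammavariate takes scale=1/rate
    return rng.gauss(0.0, 1.0 / math.sqrt(w))

rng = random.Random(2)
draws = [student_t_via_scale_mixture(4.0, rng) for _ in range(10000)]

# Heavy tails: the exceedance rate beyond +/-3 is markedly larger than
# the standard normal's ~0.27%.
tail_frac = sum(abs(x) > 3.0 for x in draws) / len(draws)
```

In a Gibbs or BUGS setting this representation is what makes heavy-tailed errors tractable: conditional on the latent precisions the model is Gaussian, so standard updates apply.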
70.
This paper uses the theoretical lens of object relations and takes the position that the paranoid-schizoid position and the related mechanism of projective identification—cornerstone pathology found in the phenomenon of fascism and totalitarianism—are not atypical, but rather live within the seemingly normal individual. Using Rwanda as a case example, the author illustrates the continuum of beliefs and actions that can result in genocide, and then describes some of the treatment considerations facing the clinician dealing with victims of genocide.
Boris Thomas (Email:)