981.
C. B. García J. García Pérez J. R. van Dorp 《Statistical Methods and Applications》2011,20(4):463-486
A prevalence of heavy-tailed, peaked and skewed uncertainty phenomena has been cited in the literature dealing with economics, physics, and engineering data. This fact has invigorated the search for continuous distributions of this nature. In this paper we generalize the two-sided framework presented in Kotz and van Dorp (Beyond Beta: Other Continuous Families of Distributions with Bounded Support and Applications. World Scientific Press, Singapore, 2004) for the construction of families of distributions with bounded support via a mixture technique utilizing two generating densities instead of one. The family of Elevated Two-Sided Power (ETSP) distributions is studied as an instance of this generalized framework. Through a moment ratio diagram comparison, we demonstrate that the ETSP family allows for remarkable flexibility when modeling heavy-tailed and peaked, yet skewed, uncertainty phenomena. We demonstrate its applicability via an illustrative example utilizing 2008 US income data.
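As a rough illustration of the building block behind this family, the standard two-sided power (TSP) density of Kotz and van Dorp can be sketched as follows (parameter values are arbitrary; the ETSP construction itself mixes two such generating densities and is not reproduced here):

```python
import numpy as np

def tsp_pdf(x, theta=0.3, n=4.0):
    """Density of the standard two-sided power distribution on [0, 1]
    with mode theta and shape parameter n: a power rise up to theta
    followed by a mirrored power decay on (theta, 1]."""
    x = np.asarray(x, dtype=float)
    left = n * (x / theta) ** (n - 1)
    right = n * ((1 - x) / (1 - theta)) ** (n - 1)
    return np.where(x <= theta, left, right)

# sanity check: the density integrates to one over [0, 1] (midpoint rule)
N = 200_000
xs = (np.arange(N) + 0.5) / N
mass = tsp_pdf(xs).mean()
```

Raising `n` sharpens the peak at `theta`, which is the kind of peaked-but-skewed behavior the abstract refers to.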
982.
Francesca Greselin Salvatore Ingrassia Antonio Punzo 《Statistical Methods and Applications》2011,20(2):141-170
This paper extends the scedasticity comparison among several groups of observations, usually confined to the homoscedastic and heteroscedastic cases, in order to deal with data sets lying in an intermediate situation. As is well known, homoscedasticity corresponds to equality in orientation, shape and size of the group scatters. Here our attention is focused on two weaker requirements: scatters with the same orientation but different shape and size, or scatters with the same shape and size but different orientation. We introduce a multiple testing procedure that takes each of the above conditions into account. This approach discloses richer information on the underlying structure of the data than the classical method based only on homo/heteroscedasticity. At the same time, it allows a more parsimonious parametrization whenever the patterned model is appropriate for the real data. The new inferential methodology is then applied to some well-known data sets from the multivariate literature to show the real gain in using this more informative approach. Finally, a wide simulation study illustrates and compares the performance of the proposal using data sets with gradual departure from homoscedasticity.
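The orientation/shape/size language above refers to the usual eigen-decomposition of a group scatter matrix, Σ = λΓΩΓᵀ, with size λ = |Σ|^(1/p), shape matrix Ω (unit determinant) and orientation Γ. A minimal sketch, with hypothetical function names:

```python
import numpy as np

def decompose_scatter(sigma):
    """Split a scatter matrix into size (lambda), shape (Omega, diagonal
    with unit determinant) and orientation (Gamma, eigenvector matrix)."""
    eigvals, gamma = np.linalg.eigh(sigma)
    eigvals = eigvals[::-1]            # sort eigenvalues in decreasing order
    gamma = gamma[:, ::-1]             # keep eigenvectors aligned
    p = sigma.shape[0]
    size = np.prod(eigvals) ** (1.0 / p)      # |Sigma|^(1/p)
    shape = np.diag(eigvals / size)           # det(shape) == 1 by construction
    return size, shape, gamma

sigma = np.array([[4.0, 1.0], [1.0, 2.0]])
size, shape, gamma = decompose_scatter(sigma)
rebuilt = size * gamma @ shape @ gamma.T      # recovers sigma exactly
```

"Same orientation" across groups then means equal Γ's; "same shape and size" means equal λ's and Ω's.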
983.
Yves F. Atchadé 《Statistics and Computing》2011,21(4):463-473
In empirical Bayes inference one is typically interested in sampling from the posterior distribution of a parameter with a hyper-parameter set to its maximum likelihood estimate. This is often problematic, particularly when the likelihood function of the hyper-parameter is not available in closed form and the posterior distribution is intractable. Previous works have dealt with this problem using a multi-step approach based on the EM algorithm and Markov chain Monte Carlo (MCMC). We propose a framework based on recent developments in adaptive MCMC, where this problem is addressed more efficiently using a single Monte Carlo run. We discuss the convergence of the algorithm and its connection with the EM algorithm. We apply our algorithm to the Bayesian Lasso of Park and Casella (J. Am. Stat. Assoc. 103:681–686, 2008) and to the empirical Bayes variable selection of George and Foster (Biometrika 87:731–747, 2000).
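The single-run idea can be caricatured in a toy conjugate model, where the "MCMC" draw is exact and a Robbins–Monro step nudges the hyper-parameter toward its marginal likelihood maximizer (the model, step sizes and names below are our illustrative assumptions, not the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
y = rng.normal(loc=0.8, scale=1.0, size=n)   # data: y_i ~ N(theta, 1)
S = y.sum()

h = 1.0                                      # prior variance: theta ~ N(0, h)
for t in range(1, 20_001):
    # exact draw from the conditional posterior theta | y, h (conjugacy)
    prec = n + 1.0 / h
    theta = rng.normal(S / prec, np.sqrt(1.0 / prec))
    # Robbins-Monro step along the score d/dh log N(theta; 0, h)
    grad = theta ** 2 / (2 * h ** 2) - 1.0 / (2 * h)
    h = max(h + (1.0 / t) * grad, 1e-3)      # clamp to keep h positive
```

At the fixed point E[θ² | h] = h, which is exactly the EM stationarity condition for the marginal likelihood of h, hinting at the EM connection mentioned above.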
984.
This paper proposes a functional connectivity approach, inspired by the brain imaging literature, to model cross-sectional dependence. Using a varying parameter framework, the model allows correlation patterns to arise from complex economic or social relations rather than being simply functions of economic or geographic distances between locations. It nests the conventional spatial and factor model approaches as special cases. A Bayesian Markov chain Monte Carlo method implements this approach. A small-scale Monte Carlo study is conducted to evaluate the performance of this approach in finite samples; it outperforms both a spatial model and a factor model. We apply the functional connectivity approach to estimate a hedonic housing price model for Paris using housing transactions over the period 1990–2003. It allows us to extract more information about complex spatial connections and appears more suitable for capturing cross-sectional dependence than the conventional methods.
985.
The control and treatment of dyslipidemia is a major public health challenge, particularly for patients with coronary heart disease. In this paper we propose a framework for survival analysis of patients who had a major cardiac event, focusing on assessment of the effect of changing LDL-cholesterol level and statin consumption on survival. This framework includes a Cox PH model and a Markov chain, and combines their results into reinforced conclusions regarding the factors that affect survival time. We prospectively studied 2,277 cardiac patients, and the results show high congruence between the Markov model and the PH model; both indicate that diabetes, history of stroke, peripheral vascular disease and smoking significantly increase the hazard rate and reduce survival time. On the other hand, statin consumption is correlated with a lower hazard rate and longer survival time in both models. The role of such a framework in understanding the therapeutic behavior of patients and implementing effective secondary and primary prevention of heart disease is discussed.
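The Markov-chain side of such a framework starts from a transition matrix estimated from observed patient state sequences. A minimal sketch, with a hypothetical state coding:

```python
import numpy as np

STATES = ["stable", "event", "dead"]   # hypothetical clinical states 0, 1, 2

def estimate_transition_matrix(sequences, n_states=3):
    """Maximum-likelihood estimate of a Markov transition matrix:
    row i holds the empirical probabilities of moving out of state i."""
    counts = np.zeros((n_states, n_states))
    for seq in sequences:
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    # normalise only rows with at least one observed transition
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

sequences = [[0, 0, 1, 0, 2], [0, 1, 1, 2], [0, 0, 0, 1]]
P = estimate_transition_matrix(sequences)
```

Comparing which covariate-defined subgroups show elevated transition rates into the absorbing state is what makes the Markov results directly comparable with the Cox PH hazard ratios.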
986.
In this paper we introduce a new extension of the Birnbaum–Saunders distribution based on the family of epsilon-skew-symmetric distributions studied in Arellano-Valle et al. (J Stat Plan Inference 128(2):427–443, 2005). The extension generates Birnbaum–Saunders-type distributions able to deal with extreme or outlying observations (Dupuis and Mills, IEEE Trans Reliab 47:88–95, 1998). Basic properties such as moments and the Fisher information matrix are also studied. Results of a real data application are reported, illustrating the good fitting properties of the proposed model.
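One way such an extension can be pieced together is to replace the normal kernel in the Birnbaum–Saunders density with an epsilon-skew-normal one. The sketch below is our illustrative reading of that construction, not necessarily the paper's exact parametrization:

```python
import numpy as np

def esn_pdf(x, eps):
    """Epsilon-skew-normal kernel: a standard normal density stretched by
    (1 + eps) left of zero and (1 - eps) right of zero; integrates to 1."""
    scale = np.where(x < 0, 1.0 + eps, 1.0 - eps)
    return np.exp(-0.5 * (x / scale) ** 2) / np.sqrt(2 * np.pi)

def esbs_pdf(t, alpha=0.5, beta=1.0, eps=0.3):
    """Epsilon-skew Birnbaum-Saunders density: the ESN kernel evaluated at
    the usual BS transform a(t), times the Jacobian |da/dt|."""
    a = ((t / beta) ** 0.5 - (beta / t) ** 0.5) / alpha
    da = (t + beta) / (2 * alpha * np.sqrt(beta) * t ** 1.5)
    return esn_pdf(a, eps) * da

# the change of variables preserves total mass (left Riemann sum on t > 0)
ts = np.linspace(1e-4, 60.0, 600_000)
mass = np.sum(esbs_pdf(ts)) * (ts[1] - ts[0])
```

Nonzero `eps` thickens one tail of the lifetime distribution, which is how the construction accommodates outlying observations.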
987.
A double censoring scheme occurs when the lifetimes T being measured, from a well-known time origin, are exactly observed within a window [L, R] of observational time and are otherwise censored either from above (right-censored observations) or from below (left-censored observations). Sample data consist of the pairs (U, δ), where U = min{R, max{T, L}} and δ indicates whether T is exactly observed (δ = 0), right-censored (δ = 1) or left-censored (δ = −1). We are interested in the estimation of the marginal behaviour of the three random variables T, L and R based on the observed pairs (U, δ). We propose new nonparametric simultaneous marginal estimators $\hat S_T$, $\hat S_L$ and $\hat S_R$ for the survival functions of T, L and R, respectively, by means of an inverse-probability-of-censoring approach. The proposed estimators $\hat S_T$, $\hat S_L$ and $\hat S_R$ are not computationally intensive, generalize the empirical survival estimator and reduce to the Kaplan–Meier estimator in the absence of left-censored data. Furthermore, $\hat S_T$ is equivalent to a self-consistent estimator, is uniformly strongly consistent and asymptotically normal. The method is illustrated with data from a cohort of drug users recruited in a detoxification program in Badalona (Spain). For these data we estimate the survival function for the elapsed time from starting IV drug use to AIDS diagnosis, as well as the potential follow-up time. A simulation study is discussed to assess the performance of the three survival estimators for moderate sample sizes and different censoring levels.
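The observation mechanism U = min{R, max{T, L}} and its censoring indicator are easy to mimic in code (a hypothetical sketch; variable names are ours):

```python
import numpy as np

def observe(t, l, r):
    """Doubly censored observation: returns U = min(R, max(T, L)) and
    delta = 0 (T exactly observed), 1 (right-censored), -1 (left-censored)."""
    u = np.minimum(r, np.maximum(t, l))
    delta = np.where(t > r, 1, np.where(t < l, -1, 0))
    return u, delta

t = np.array([5.0, 0.5, 9.0])   # latent lifetimes
l = np.array([1.0, 1.0, 1.0])   # left limits of the observation window
r = np.array([8.0, 8.0, 8.0])   # right limits of the observation window
u, delta = observe(t, l, r)     # lifetime 0.5 is left-censored at 1,
                                # lifetime 9 is right-censored at 8
```

The estimators above then reweight these (U, δ) pairs by inverse probabilities of censoring to recover the three marginal survival functions.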
988.
989.
Powerful entropy-based tests for normality, uniformity and exponentiality have been well addressed in the statistical literature. The density-based empirical likelihood approach improves the performance of these goodness-of-fit tests by forming them into approximate likelihood ratios. This method is extended to develop two-sample empirical likelihood approximations to optimal parametric likelihood ratios, resulting in an efficient test based on sample entropy. The proposed distribution-free two-sample test is shown to be very competitive with well-known nonparametric tests. For example, the new test has high and stable power for detecting a nonconstant shift in the two-sample problem, where Wilcoxon's test may break down completely. This is partly due to the inherent structure developed within Neyman–Pearson-type lemmas. The outputs of an extensive Monte Carlo analysis and a real data example support our theoretical results. The Monte Carlo simulation study indicates that the proposed test compares favorably with standard procedures for a wide range of null and alternative distributions.
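The sample-entropy ingredient of such tests is typically a spacing estimator in the spirit of Vasicek; a minimal sketch (the default window `m` is an arbitrary assumption):

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek spacing estimator of differential entropy:
    mean of log(n/(2m) * (X_(i+m) - X_(i-m))) over order statistics,
    with indices clamped at the sample boundaries."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = int(np.sqrt(n)) // 2 + 1
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]
    lower = x[np.maximum(idx - m, 0)]
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))

rng = np.random.default_rng(1)
h_unif = vasicek_entropy(rng.uniform(size=2000))  # true value: 0
h_norm = vasicek_entropy(rng.normal(size=2000))   # true value: 0.5*ln(2*pi*e)
```

Plugging such estimates into the empirical likelihood ratio is what turns the entropy statistic into an approximate likelihood ratio test.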
990.
This note provides the asymptotic distribution of a Perron-type innovational outlier unit root test developed by Popp (J Stat Comput Sim 78:1145–1161, 2008) in the case of a shift in the intercept for non-trending data. In Popp (2008), only critical values for finite samples, based on Monte Carlo techniques, are tabulated. Using arguments similar to those in Zivot and Andrews (J Bus Econ Stat 10:251–270, 1992), weak convergence is shown for the test statistics.
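The flavor of such Monte Carlo tabulation can be illustrated with a plain Dickey–Fuller-type t-statistic simulated under the unit-root null (a generic sketch, not Popp's test, which additionally accommodates a structural break):

```python
import numpy as np

def df_tstat(y):
    """t-statistic on rho in the regression diff(y)_t = c + rho*y_{t-1} + e_t
    (constant, no trend), computed by ordinary least squares."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    e = dy - X @ beta
    s2 = e @ e / (len(dy) - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

# simulate the null distribution on random walks; empirical quantiles of
# `stats` play the role of tabulated finite-sample critical values
rng = np.random.default_rng(2)
stats = [df_tstat(np.cumsum(rng.normal(size=500))) for _ in range(400)]
```

Under the null this statistic follows a non-standard (Dickey–Fuller) limit distribution, which is exactly why weak-convergence arguments of the Zivot–Andrews type are needed to characterize the asymptotics.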