51.
The authors derive analytic expressions for the mean and variance of the log-likelihood ratio statistic for testing equality of k (k ≥ 2) normal populations, and suggest a chi-square approximation and a gamma approximation to its exact null distribution. Numerical comparisons show that both approximations, as well as the original beta approximation of Neyman and Pearson (1931, "On the problem of k samples," Joint Statistical Papers, Cambridge University Press, pp. 116–131), are accurate, with the gamma approximation being the most accurate.
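As a rough illustration (not the paper's derivation), the sketch below computes the log-likelihood ratio statistic for testing that k normal samples share a common mean and variance and applies the standard Wilks chi-square approximation on 2(k − 1) degrees of freedom; the refined moment-matched chi-square and gamma approximations of the article are not reproduced.

```python
# A minimal sketch: -2 log(lambda) for testing that k normal samples share a
# common mean and variance, with a Wilks chi-square approximation on 2(k-1)
# degrees of freedom. The paper's approximations, which match the exact null
# mean and variance, are not reproduced here.
import numpy as np
from scipy import stats

def lrt_equal_normal_populations(samples):
    """samples: list of 1-D arrays, one per population."""
    n_i = np.array([len(x) for x in samples])
    n = n_i.sum()
    # MLE variances under the alternative (separate means and variances)
    var_i = np.array([np.mean((x - x.mean()) ** 2) for x in samples])
    # MLE variance under the null (common mean and variance)
    pooled = np.concatenate(samples)
    var_0 = np.mean((pooled - pooled.mean()) ** 2)
    stat = n * np.log(var_0) - np.sum(n_i * np.log(var_i))  # -2 log(lambda)
    df = 2 * (len(samples) - 1)
    p_value = stats.chi2.sf(stat, df)
    return stat, p_value

rng = np.random.default_rng(0)
groups = [rng.normal(0, 1, 30), rng.normal(0.2, 1.2, 25), rng.normal(0, 0.9, 40)]
print(lrt_equal_normal_populations(groups))
```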
52.
53.
Abstract

This study analyzes more than 400 SFX broken-link reports submitted by users of an academic library. It identifies technical issues in OpenURL linking related to the handling of special journal volume and issue numbers, journal supplemental issues, embargo release dates, book reviews, DOIs, and other areas. It reports on the full-text resources with the most broken links, the causes of broken links, and the library's responses to users. It also explores how journal publishers, database vendors, and OpenURL vendors can improve the quality of their products, and how librarians can better serve users.
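As a hedged illustration of the linking mechanism involved, the sketch below assembles an OpenURL 1.0 (KEV) journal-article link of the kind an SFX resolver consumes; the resolver base URL and all metadata values are hypothetical, and irregular fields such as a supplement issue number are typical of the broken-link causes the study discusses.

```python
# A minimal sketch of building an OpenURL 1.0 (KEV) link for a journal
# article. The resolver base URL and metadata are hypothetical; values such
# as a supplement issue ("Suppl 2") or a malformed DOI often fail to match a
# target's expected syntax and produce the broken links analyzed in the study.
from urllib.parse import urlencode

BASE = "https://resolver.example.edu/sfx"  # hypothetical SFX instance

def build_openurl(jtitle, volume, issue, spage, date, doi=None):
    params = {
        "url_ver": "Z39.88-2004",
        "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",
        "rft.jtitle": jtitle,
        "rft.volume": volume,
        "rft.issue": issue,      # e.g. "Suppl 2" may not match target syntax
        "rft.spage": spage,
        "rft.date": date,
    }
    if doi:
        params["rft_id"] = f"info:doi/{doi}"
    return f"{BASE}?{urlencode(params)}"

print(build_openurl("Journal of Example Studies", "12", "Suppl 2", "45",
                    "2009", doi="10.1000/xyz123"))
```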
54.
We consider asymmetric kernel estimates based on grouped data. We propose an iterated scheme for constructing such an estimator and apply an iterated smoothed bootstrap approach for bandwidth selection. We compare our approach with competing methods in estimating actuarial loss models using both simulations and data studies. The simulation results show that with this new method, the estimated density from grouped data matches the true density more closely than with competing approaches.
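A minimal sketch of the basic idea, assuming a gamma (asymmetric) kernel applied to bin midpoints weighted by bin counts; the iterated construction and the smoothed-bootstrap bandwidth selection proposed in the paper are not reproduced, and the bandwidth is fixed for illustration.

```python
# A minimal sketch of an asymmetric (gamma) kernel density estimate built from
# grouped data, using bin midpoints weighted by bin counts. The paper's
# iterated scheme and smoothed-bootstrap bandwidth selection are not
# reproduced; a fixed bandwidth b is assumed for illustration.
import numpy as np
from scipy import stats

def gamma_kde_grouped(x_grid, bin_edges, bin_counts, b):
    mids = 0.5 * (np.asarray(bin_edges[:-1]) + np.asarray(bin_edges[1:]))
    w = np.asarray(bin_counts, dtype=float)
    w /= w.sum()
    # At each grid point x, average the Gamma(shape = x/b + 1, scale = b)
    # density evaluated at the weighted bin midpoints.
    dens = [np.sum(w * stats.gamma.pdf(mids, a=x / b + 1.0, scale=b))
            for x in x_grid]
    return np.array(dens)

edges = [0, 500, 1000, 2500, 5000, 10000]   # hypothetical loss intervals
counts = [120, 80, 45, 20, 5]               # hypothetical claim counts
grid = np.linspace(1, 10000, 200)
fhat = gamma_kde_grouped(grid, edges, counts, b=300.0)
```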
55.
Methods for a sequential test of a dose-response effect in pre-clinical studies are investigated. The objective of the test procedure is to compare several dose groups with a zero-dose control. The sequential testing is conducted within a closed family of one-sided tests. The procedures investigated are based on a monotonicity assumption. These closed procedures strongly control the familywise error rate while providing information about the shape of the dose-response relationship. The performance of the sequential testing procedures is compared via a Monte Carlo simulation study. We illustrate the procedures by application to a real data set.
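A minimal sketch of one simple member of this family of closed, monotonicity-based procedures (not necessarily the specific procedures investigated here): a fixed-sequence step-down comparison of dose groups against the zero-dose control using one-sided Welch t-tests.

```python
# A minimal sketch of a fixed-sequence step-down procedure under a monotone
# dose-response assumption: test the highest dose against the zero-dose
# control first with a one-sided two-sample t-test at level alpha, and step
# down to lower doses only while rejections continue. This strongly controls
# the familywise error rate; it is illustrative, not the paper's procedure.
import numpy as np
from scipy import stats

def step_down_dose_test(control, dose_groups, alpha=0.05):
    """dose_groups: list of arrays ordered from lowest to highest dose."""
    significant = []
    for dose, sample in reversed(list(enumerate(dose_groups, start=1))):
        t, p = stats.ttest_ind(sample, control, alternative="greater",
                               equal_var=False)
        if p <= alpha:
            significant.append(dose)
        else:
            break                       # stop at the first non-rejection
    return sorted(significant)

rng = np.random.default_rng(1)
control = rng.normal(0.0, 1.0, 20)
doses = [rng.normal(d, 1.0, 20) for d in (0.1, 0.6, 1.2)]
print(step_down_dose_test(control, doses))
```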
56.
Following the extension from linear mixed models to additive mixed models, generalized linear mixed models are extended to generalized additive mixed models. Algorithms are developed to compute the MLEs of the nonlinear effects and the covariance structures based on the penalized marginal likelihood. Convergence of the algorithms and selection of the smoothing parameters are discussed.
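A minimal sketch of the penalized-likelihood ingredient only, assuming a Poisson additive model with a single P-spline smooth fitted by penalized IRLS; the random-effects/covariance part of a generalized additive mixed model is omitted, and the smoothing parameter is fixed rather than selected.

```python
# A minimal sketch: penalized IRLS for one P-spline smooth in a Poisson
# additive model. It illustrates the penalized-likelihood idea only; random
# effects and smoothing-parameter selection are omitted, and lam is fixed.
import numpy as np
from scipy.interpolate import BSpline

def bspline_basis(x, n_basis=20, degree=3):
    xl, xr = x.min() - 1e-6, x.max() + 1e-6
    inner = np.linspace(xl, xr, n_basis - degree + 1)
    knots = np.concatenate([[xl] * degree, inner, [xr] * degree])
    return np.column_stack([BSpline(knots, np.eye(n_basis)[j], degree)(x)
                            for j in range(n_basis)])

def fit_poisson_pspline(x, y, lam=10.0, n_iter=50):
    B = bspline_basis(x)
    D = np.diff(np.eye(B.shape[1]), n=2, axis=0)   # second-difference penalty
    P = lam * D.T @ D
    beta = np.zeros(B.shape[1])
    for _ in range(n_iter):                        # penalized IRLS iterations
        eta = B @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu                    # working response
        beta = np.linalg.solve(B.T @ (mu[:, None] * B) + P, B.T @ (mu * z))
    return B, beta

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 200))
y = rng.poisson(np.exp(1.0 + np.sin(2 * np.pi * x)))
B, beta = fit_poisson_pspline(x, y)
fitted_mean = np.exp(B @ beta)
```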
57.
ABSTRACT

This article investigates a quasi-maximum exponential likelihood estimator (QMELE) for a nonstationary generalized autoregressive conditional heteroscedastic (GARCH(1,1)) model. Asymptotic normality of this estimator is derived under a nonstationarity condition. A simulation study and a real example are given to evaluate the performance of the QMELE for this model.
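A minimal sketch of the exponential (Laplace) quasi-likelihood objective for GARCH(1,1), minimized numerically; the initialization of the conditional variance and the error-scaling convention are simplifying assumptions, and the paper's asymptotic theory is not touched.

```python
# A minimal sketch of a quasi-maximum exponential (Laplace) likelihood fit of
# GARCH(1,1): h_t = omega + alpha*y_{t-1}^2 + beta*h_{t-1}. The h_1
# initialization and the error-scaling convention are simplifying assumptions.
import numpy as np
from scipy.optimize import minimize

def qmele_objective(params, y):
    omega, alpha, beta = params
    n = len(y)
    h = np.empty(n)
    h[0] = np.var(y)                    # crude initialization of h_1
    for t in range(1, n):
        h[t] = omega + alpha * y[t - 1] ** 2 + beta * h[t - 1]
    s = np.sqrt(h)
    return np.sum(np.log(s) + np.abs(y) / s)

def fit_qmele(y):
    x0 = np.array([0.1, 0.1, 0.8])
    bounds = [(1e-6, None)] * 3         # nonstationarity (alpha+beta > 1) allowed
    res = minimize(qmele_objective, x0, args=(y,), bounds=bounds,
                   method="L-BFGS-B")
    return res.x

# simulate an integrated GARCH(1,1) path (alpha + beta = 1, not covariance
# stationary) and fit it
rng = np.random.default_rng(3)
n, omega, alpha, beta = 2000, 0.1, 0.3, 0.7
y, h = np.empty(n), np.empty(n)
h[0] = omega
y[0] = np.sqrt(h[0]) * rng.standard_normal()
for t in range(1, n):
    h[t] = omega + alpha * y[t - 1] ** 2 + beta * h[t - 1]
    y[t] = np.sqrt(h[t]) * rng.standard_normal()
print(fit_qmele(y))
```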
58.
ABSTRACT

Holm's step-down testing procedure starts with the smallest p-value and sequentially screens larger p-values, without providing any information on confidence intervals. This article changes the conventional step-down testing framework by presenting a nonparametric procedure that starts with the largest p-value and sequentially screens smaller p-values, step by step, to construct a set of simultaneous confidence sets. We use a partitioning approach to prove that the new procedure controls the simultaneous confidence level (and thus strongly controls the familywise error rate). Discernible features of the new stepwise procedure include consistency with individual inference, coherence, and confidence estimates for follow-up investigations. In a simple simulation study, the proposed procedure, treated as a testing procedure, is more powerful than Holm's procedure when the correlation coefficient is large, and vice versa when it is small. In the data analysis of a medical study, the new procedure detects the efficacy of aspirin as a cardiovascular prophylaxis in a nonparametric setting.
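For reference, a minimal sketch of Holm's step-down procedure, the baseline the article contrasts with; the authors' partition-based procedure that starts from the largest p-value and yields simultaneous confidence sets is not reproduced here.

```python
# A minimal sketch of Holm's step-down procedure: compare the ordered
# p-values with alpha/m, alpha/(m-1), ... and stop at the first non-rejection.
import numpy as np

def holm_step_down(p_values, alpha=0.05):
    p = np.asarray(p_values, dtype=float)
    m = len(p)
    order = np.argsort(p)                      # smallest p-value first
    reject = np.zeros(m, dtype=bool)
    for step, idx in enumerate(order):
        if p[idx] <= alpha / (m - step):       # Holm's adjusted threshold
            reject[idx] = True
        else:
            break                              # stop at first non-rejection
    return reject

print(holm_step_down([0.001, 0.02, 0.04, 0.30]))
```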
59.
Abstract

In this paper, we discuss how to model the mean and covariance structures in linear mixed models (LMMs) simultaneously. We propose a data-driven method to model the covariance structures of the random effects and random errors in LMMs. Parameter estimation for the mean and covariances is carried out with an EM algorithm, and standard errors of the parameter estimates are calculated through Louis' (1982) information principle. Kenward's (1987) cattle data sets are analyzed for illustration, and comparison with existing work is made through simulation studies. Our numerical analysis confirms the superiority of the proposed method over existing approaches in terms of the Akaike information criterion.
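A minimal sketch of the EM machinery for a simple random-intercept LMM with balanced groups; the paper's joint data-driven modelling of the mean and covariance structures and the Louis (1982) standard errors are not reproduced, and the data are simulated rather than Kenward's cattle data.

```python
# A minimal sketch of an EM fit for a balanced random-intercept LMM,
# y_ij = mu + b_i + e_ij, estimating (mu, sigma_b^2, sigma_e^2). It only
# illustrates the EM machinery, not the paper's joint mean-covariance model.
import numpy as np

def em_random_intercept(y, n_iter=200):
    """y: array of shape (k groups, m observations per group)."""
    k, m = y.shape
    mu, s2_b, s2_e = y.mean(), y.mean(axis=1).var(), y.var()
    for _ in range(n_iter):
        # E-step: posterior mean and variance of each random intercept b_i
        shrink = m * s2_b / (m * s2_b + s2_e)
        b_hat = shrink * (y.mean(axis=1) - mu)
        v = s2_b * s2_e / (m * s2_b + s2_e)
        # M-step: update fixed effect and variance components
        mu = (y - b_hat[:, None]).mean()
        s2_b = np.mean(b_hat ** 2) + v
        s2_e = np.mean((y - mu - b_hat[:, None]) ** 2) + v
    return mu, s2_b, s2_e

rng = np.random.default_rng(4)
k, m = 30, 11                                   # e.g. 30 subjects, 11 visits
b = rng.normal(0, 2.0, size=k)
y = 5.0 + b[:, None] + rng.normal(0, 1.0, size=(k, m))
print(em_random_intercept(y))
```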
60.
Abstract

This paper is devoted to the study of a risk-based optimal investment and proportional reinsurance problem. The surplus process of the insurer and the risky asset process in the financial market are assumed to be general jump-diffusion processes. We use a convex risk measure generated by g-expectation to describe the risk of the terminal wealth under investment and reinsurance. With the aim of minimizing this risk, the problem is solved using stochastic maximum principle techniques. Two interesting special cases are studied, and explicit expressions for the optimal strategies and the corresponding minimal risks are derived.
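A minimal simulation sketch of a controlled surplus process under fixed (non-optimal) strategies: proportional reinsurance with retention level q and a constant amount invested in a diffusion risky asset, with compound Poisson claims; the g-expectation risk measure and the optimal strategies obtained via the stochastic maximum principle are not reproduced, and all parameter values are illustrative assumptions.

```python
# A minimal Euler-type simulation of a controlled surplus process under fixed
# strategies: retention level q of each claim, reinsurance premium paid at
# rate (1-q)*c_re, and a constant amount pi invested in a diffusion risky
# asset. All parameter values are illustrative assumptions.
import numpy as np

def simulate_surplus(x0=10.0, q=0.7, pi=2.0, T=10.0, dt=0.01,
                     c=2.0, c_re=2.4, lam=1.0, claim_mean=1.5,
                     mu=0.08, sigma=0.25, seed=5):
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    x = np.empty(n + 1)
    x[0] = x0
    drift = c - (1.0 - q) * c_re + pi * mu      # premiums net of reinsurance
    for t in range(n):
        n_claims = rng.poisson(lam * dt)
        claims = rng.exponential(claim_mean, n_claims).sum()
        x[t + 1] = (x[t] + drift * dt
                    + pi * sigma * np.sqrt(dt) * rng.standard_normal()
                    - q * claims)               # insurer retains q of each claim
    return x

path = simulate_surplus()
print(path[-1], path.min())
```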