Articles by access type:
Paid full text: 20455
Free: 803
Domestic free: 238
Articles by subject:
Management science: 2021
Labor science: 3
Ethnology: 104
Talent studies: 3
Demography: 352
Collected works: 1232
Theory and methodology: 497
General: 9990
Sociology: 657
Statistics: 6637
Articles by year:
2024: 34; 2023: 152; 2022: 246; 2021: 286; 2020: 403; 2019: 521; 2018: 602; 2017: 801; 2016: 640; 2015: 680
2014: 1119; 2013: 2758; 2012: 1509; 2011: 1303; 2010: 1088; 2009: 1028; 2008: 1117; 2007: 1154; 2006: 1070; 2005: 926
2004: 801; 2003: 687; 2002: 607; 2001: 501; 2000: 336; 1999: 235; 1998: 135; 1997: 136; 1996: 107; 1995: 96
1994: 71; 1993: 59; 1992: 60; 1991: 54; 1990: 30; 1989: 26; 1988: 23; 1987: 15; 1986: 10; 1985: 16
1984: 16; 1983: 12; 1982: 7; 1981: 3; 1980: 3; 1979: 7; 1978: 2; 1977: 2; 1976: 1; 1975: 1
A total of 10,000 query results were returned (search time: 93 ms).
371.
Varying-coefficient models are very useful for longitudinal data analysis. In this paper, we develop a new estimation procedure for varying-coefficient models with longitudinal data, using the Cholesky decomposition and profile least squares techniques. Asymptotic normality is established for the proposed estimators of the varying-coefficient functions. Monte Carlo simulation studies show excellent finite-sample performance, and we illustrate the methods with a real data example.
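For orientation, here is a minimal, hedged sketch of the basic ingredient behind such estimators: a kernel-weighted least-squares fit of a single varying coefficient on simulated data. It is not the authors' Cholesky/profile least-squares procedure (which also accounts for within-subject covariance); the data, kernel, and bandwidth are illustrative assumptions.

```python
# Minimal sketch: kernel-weighted least-squares estimation of a varying
# coefficient beta(t) in y = beta(t) * x + error, on simulated data.
# Illustrative only; the paper's Cholesky/profile least-squares procedure
# and its treatment of within-subject covariance are not reproduced.
import numpy as np

rng = np.random.default_rng(0)
n = 500
t = rng.uniform(0, 1, n)                   # observation times
x = rng.normal(size=n)                     # covariate
beta = np.sin(2 * np.pi * t)               # true varying coefficient
y = beta * x + 0.3 * rng.normal(size=n)    # response

def local_beta(t0, h=0.1):
    """Kernel-weighted least-squares estimate of beta at time t0."""
    w = np.exp(-0.5 * ((t - t0) / h) ** 2)           # Gaussian kernel weights
    return np.sum(w * x * y) / np.sum(w * x ** 2)    # weighted LS solution

grid = np.linspace(0.05, 0.95, 19)
est = np.array([local_beta(t0) for t0 in grid])
print(np.round(est - np.sin(2 * np.pi * grid), 2))   # estimation error on the grid
```

The paper's profile least-squares step would additionally profile out other model components and exploit the estimated error covariance through its Cholesky factor, which this sketch does not attempt.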
372.
As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches to calculating insurance premiums for rare catastrophic losses. This article focuses on the design of a flood-loss-sharing program involving private insurance based on location-specific exposures. The analysis is guided by an integrated catastrophe risk management (ICRM) model consisting of a GIS-based flood model and a stochastic optimization procedure defined over location-specific risk exposures. To achieve stability and robustness of the program toward floods of various recurrences, the stochastic optimization relies on quantile-related risk functions of systemic insolvency involving overpayments and underpayments by the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust premiums derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of a Rotterdam area outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than a single average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures.
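The sketch below contrasts the two premium conventions on simulated losses. The heavy-tailed loss model and all numbers are hypothetical, and the quantile-based premium here is only a crude stand-in for the quantile-related risk functions optimized in the ICRM; no GIS flood model or location structure is included.

```python
# Hedged sketch: contrast an average-annual-loss (AAL) premium with a
# quantile-based "robust" premium on simulated catastrophic flood losses.
# The loss model and all numbers are hypothetical; the paper's ICRM couples
# a GIS-based flood model with stochastic optimization over locations.
import numpy as np

rng = np.random.default_rng(1)
n_scen = 10_000
occurs = rng.random(n_scen) < 0.02                 # 2% annual flood probability (assumed)
loss = occurs * rng.lognormal(mean=3.0, sigma=1.0, size=n_scen)

aal_premium = loss.mean()                          # traditional average annual loss
robust_premium = np.quantile(loss, 0.99)           # covers 99% of scenarios

def insolvency_prob(premium):
    """Share of scenarios in which losses exceed the collected premium."""
    return np.mean(loss > premium)

print(f"AAL premium   : {aal_premium:8.2f}  insolvency prob {insolvency_prob(aal_premium):.3f}")
print(f"robust premium: {robust_premium:8.2f}  insolvency prob {insolvency_prob(robust_premium):.3f}")
```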
373.
Extending previous work on hedge fund return predictability, this paper introduces the idea of modelling the conditional distribution of hedge fund returns using Student's t full-factor multivariate GARCH models. This class of models takes into account the stylized facts of hedge fund return series, namely heteroskedasticity, fat tails and deviations from normality. For the proposed class of multivariate predictive regression models, we derive analytic expressions for the score and the Hessian matrix, which can be used within classical and Bayesian inferential procedures to estimate the model parameters, as well as to compare different predictive regression models. We propose a Bayesian approach to model comparison which provides posterior probabilities for various predictive models that can be used for model averaging. Our empirical application indicates that accounting for fat tails and time-varying covariances/correlations provides a more appropriate modelling approach to the underlying dynamics of financial series and improves our ability to predict hedge fund returns.
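As a much-reduced illustration of this model class, the sketch below fits a univariate GARCH(1,1) with standardized Student-t innovations by maximum likelihood to simulated returns. The paper's full-factor multivariate GARCH, the predictive regressors, and the Bayesian model comparison are not reproduced; all parameter values and the simulated series are hypothetical.

```python
# Much-reduced sketch: univariate GARCH(1,1) with standardized Student-t
# innovations, fitted by maximum likelihood to simulated returns. The paper's
# full-factor multivariate GARCH, predictive regressors, and Bayesian model
# comparison are not reproduced; all parameter values are hypothetical.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(5)
T, omega, alpha, beta, nu = 2000, 0.05, 0.08, 0.90, 8.0
r, h = np.zeros(T), np.zeros(T)
h[0] = omega / (1 - alpha - beta)                    # unconditional variance
for t in range(T):
    if t > 0:
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    z = rng.standard_t(nu) * np.sqrt((nu - 2) / nu)  # unit-variance t shock
    r[t] = np.sqrt(h[t]) * z

def neg_loglik(params):
    omega, alpha, beta, nu = params
    if min(omega, alpha, beta) <= 0 or alpha + beta >= 1 or nu <= 2.1:
        return 1e10                                  # crude penalty outside the valid region
    h = np.empty(T)
    h[0] = np.var(r)
    for t in range(1, T):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    c = gammaln((nu + 1) / 2) - gammaln(nu / 2) - 0.5 * np.log(np.pi * (nu - 2))
    ll = c - 0.5 * np.log(h) - (nu + 1) / 2 * np.log1p(r ** 2 / ((nu - 2) * h))
    return -np.sum(ll)

fit = minimize(neg_loglik, x0=[0.1, 0.05, 0.85, 10.0], method="Nelder-Mead")
print("omega, alpha, beta, nu:", np.round(fit.x, 3))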
374.
Empirical Bayes estimates of the local false discovery rate can reflect uncertainty about the estimated prior by supplementing their Bayesian posterior probabilities with confidence levels as posterior probabilities. This use of coherent fiducial inference with hierarchical models generates set estimators that propagate uncertainty to varying degrees. Some of the set estimates approach estimates from plug-in empirical Bayes methods for high numbers of comparisons and can come close to the usual confidence sets given a sufficiently low number of comparisons.
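A minimal plug-in sketch of the underlying two-groups local false discovery rate, fdr(z) = pi0 * f0(z) / f(z), is shown below, with pi0 fixed conservatively at 1, f0 the theoretical N(0,1) null, and the marginal density f estimated by a kernel density. The confidence-level (fiducial) adjustment for uncertainty in the estimated prior that the paper studies is not implemented, and the simulated z-values are illustrative.

```python
# Plug-in sketch of the two-groups local false discovery rate,
# fdr(z) = pi0 * f0(z) / f(z), with pi0 = 1 (conservative), f0 the N(0,1)
# theoretical null, and f estimated by a kernel density. The paper's
# confidence-level adjustment for uncertainty in the estimated prior is not
# implemented; the simulated z-values are illustrative.
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(2)
z = np.concatenate([rng.normal(0, 1, 9000),        # null test statistics
                    rng.normal(3, 1, 1000)])       # non-null test statistics (assumed mix)

f_hat = gaussian_kde(z)                            # marginal density estimate
local_fdr = np.clip(norm.pdf(z) / f_hat(z), 0, 1)  # plug-in local fdr with pi0 = 1

print("discoveries at local fdr < 0.2:", int(np.sum(local_fdr < 0.2)))
```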
375.
In chemical and microbial risk assessments, risk assessors fit dose-response models to high-dose data and extrapolate downward to risk levels in the range of 1-10%. Although multiple dose-response models may be able to fit the data adequately in the experimental range, the estimated effective dose (ED) corresponding to an extremely small risk can be substantially different from model to model. In this respect, model averaging (MA) provides more robustness than a single dose-response model in the point and interval estimation of an ED. In MA, accounting for both data uncertainty and model uncertainty is crucial, but addressing model uncertainty is not achieved simply by increasing the number of models in a model space. A plausible set of models for MA can be characterized by goodness of fit and diversity surrounding the truth. We propose a diversity index (DI) to balance between these two characteristics in model space selection. It addresses a collective property of a model space rather than individual performance of each model. Tuning parameters in the DI control the size of the model space for MA.
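The sketch below illustrates the model-averaging step only: two candidate dose-response models (logistic and probit links) are fitted by maximum likelihood to hypothetical quantal data, combined with AIC-based weights, and a model-averaged ED10 is reported. The paper's diversity index for selecting the model space is not implemented, and the data, model set, and weighting scheme are illustrative assumptions.

```python
# Illustrative model averaging for a benchmark dose: logistic and probit
# dose-response models are fitted by maximum likelihood to hypothetical
# quantal data, combined with AIC weights, and a model-averaged ED10 is
# reported. The paper's diversity index for model-space selection is not
# implemented.
import numpy as np
from scipy.optimize import minimize, brentq
from scipy.special import expit
from scipy.stats import norm

dose  = np.array([0.0, 1.0, 2.5, 5.0, 10.0])       # hypothetical doses
n     = np.array([50, 50, 50, 50, 50])             # animals per dose group
cases = np.array([1, 3, 8, 20, 41])                # hypothetical responders

def nll(theta, link):
    """Negative binomial log-likelihood for a two-parameter link model."""
    p = np.clip(link(theta[0] + theta[1] * dose), 1e-10, 1 - 1e-10)
    return -np.sum(cases * np.log(p) + (n - cases) * np.log(1 - p))

eds, aics = [], []
for link in (expit, norm.cdf):                     # logistic and probit links
    res = minimize(nll, x0=[-2.0, 0.5], args=(link,), method="Nelder-Mead")
    a, b = res.x
    p0 = link(a)                                   # background response rate
    extra = lambda d: (link(a + b * d) - p0) / (1 - p0) - 0.10
    eds.append(brentq(extra, 1e-6, 100.0))         # ED10: dose giving 10% extra risk
    aics.append(2 * 2 + 2 * res.fun)               # AIC with k = 2 parameters

w = np.exp(-0.5 * (np.array(aics) - min(aics)))
w /= w.sum()
print("AIC weights:", np.round(w, 3), " model-averaged ED10:", round(float(np.dot(w, eds)), 3))
```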
376.
There is currently much discussion about lasso-type regularized regression, which is a useful tool for simultaneous estimation and variable selection. Although lasso-type regularization has several advantages in regression modelling owing to its sparsity, it suffers from outliers because it relies on penalized least squares. To overcome this issue, we propose a robust lasso-type estimation procedure that uses a robust criterion as the loss function while imposing an L1-type penalty, the elastic net. We also introduce efficient bootstrap information criteria for choosing the optimal regularization parameters and a constant used in outlier detection. Simulation studies and a real data analysis examine the efficiency of the proposed robust sparse regression modelling. We observe that our modelling strategy performs well in the presence of outliers.
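A minimal sketch of the idea, a robust loss combined with an elastic net penalty, is given below: a Huber loss (as a stand-in for the paper's robust criterion) plus L1 and L2 penalties, solved by proximal gradient descent with soft thresholding. The bootstrap information criteria for tuning the regularization parameters and the outlier-detection constant are not reproduced; the data and penalty weights are illustrative.

```python
# Minimal sketch of robust sparse regression: Huber loss (robust to outlying
# responses) plus an elastic net (L1 + L2) penalty, solved by proximal
# gradient descent. The loss, penalty weights, and data are illustrative;
# the paper's bootstrap information criteria for tuning are not reproduced.
import numpy as np

rng = np.random.default_rng(3)
n, p = 100, 20
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]                   # sparse true coefficients
y = X @ beta_true + 0.5 * rng.normal(size=n)
y[:5] += 15.0                                      # gross outliers in the response

def huber_grad(r, delta=1.345):
    """Derivative of the Huber loss with respect to the residual."""
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def robust_enet(X, y, lam1=0.1, lam2=0.05, iters=2000):
    """Proximal gradient (ISTA) for Huber loss + elastic net penalty."""
    n = len(y)
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n + lam2)   # step from a Lipschitz bound
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = -X.T @ huber_grad(y - X @ b) / n + lam2 * b
        z = b - step * grad
        b = np.sign(z) * np.maximum(np.abs(z) - step * lam1, 0.0)  # soft thresholding
    return b

print(np.round(robust_enet(X, y), 2))              # the three true signals should dominate
```

The soft-thresholding step is what produces exact zeros, i.e. the variable selection, while the Huber gradient caps the influence of the contaminated responses.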
377.
We consider the problem of modelling a long-memory time series using piecewise fractional autoregressive integrated moving average (ARFIMA) processes. The number and the locations of the structural break points (BPs), as well as the parameters of each regime, are assumed to be unknown. A four-step procedure is proposed to locate the BPs and to estimate the parameters of each regime. Its effectiveness is shown by Monte Carlo simulations, and an application to real traffic data modelling is considered.
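The four-step break-point search itself is not sketched here; the snippet below only illustrates one ingredient, estimating the long-memory parameter d segment by segment with the GPH log-periodogram estimator, on simulated piecewise ARFIMA(0, d, 0) noise whose break location is taken as known. Bandwidth, truncation length, and parameter values are illustrative assumptions.

```python
# Illustrative only: segment-wise estimation of the long-memory parameter d
# with the GPH log-periodogram estimator, on simulated piecewise
# ARFIMA(0, d, 0) noise with a known break location. The paper's four-step
# search for the number and locations of break points is not implemented.
import numpy as np

def frac_noise(n, d, rng, ntrunc=1000):
    """ARFIMA(0, d, 0) sample via a truncated MA expansion of (1 - B)^(-d)."""
    psi = np.ones(ntrunc)
    for j in range(1, ntrunc):
        psi[j] = psi[j - 1] * (j - 1 + d) / j
    e = rng.normal(size=n + ntrunc)
    return np.convolve(e, psi)[ntrunc:ntrunc + n]  # drop the burn-in

def gph_d(x):
    """GPH log-periodogram regression estimate of d (bandwidth m = sqrt(n))."""
    n = len(x)
    m = int(n ** 0.5)
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    per = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    reg = -2 * np.log(2 * np.sin(freqs / 2))
    return np.polyfit(reg, np.log(per), 1)[0]      # slope estimates d

rng = np.random.default_rng(4)
x = np.concatenate([frac_noise(2000, 0.1, rng),    # regime 1: weak long memory
                    frac_noise(2000, 0.4, rng)])   # regime 2: strong long memory
print("d before break:", round(gph_d(x[:2000]), 2),
      " d after break:", round(gph_d(x[2000:]), 2))
```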
378.
379.
"资源稀释模型"认为,在家庭经济资源一定的情况下,家庭里的孩子数量越多,每个孩子分到的经济资源就会相应减少;而且家庭的经济资源对孩子是否有机会接受高等教育的影响非常重要。笔者从"资源稀释模型(Resource Dilution Model)"这一视角来分析和探讨,上世纪末我国"高等教育规模扩张"政策何以得以顺利实施。  相似文献   
380.
A novel method was used to incorporate in vivo host-pathogen dynamics into a new robust outbreak model for legionellosis. Dose-response and time-dose-response (TDR) models were generated for Legionella longbeachae exposure to mice via the intratracheal route using a maximum likelihood estimation approach. The best-fit TDR model was then incorporated into two L. pneumophila outbreak models: an outbreak that occurred at a spa in Japan, and one that occurred in a Melbourne aquarium. The best-fit TDR from the murine dosing study was the beta-Poisson with exponential-reciprocal dependency model, which had a minimized deviance of 32.9. This model was tested against other incubation distributions in the Japan outbreak, and performed consistently well, with reported deviances ranging from 32 to 35. In the case of the Melbourne outbreak, the exponential model with exponential dependency was tested against non-time-dependent distributions to explore the performance of the time-dependent model with the lowest number of parameters. This model reported low minimized deviances around 8 for the Weibull, gamma, and lognormal exposure distribution cases. This work shows that the incorporation of a time factor into outbreak distributions provides models with acceptable fits that can provide insight into the in vivo dynamics of the host-pathogen system.
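A hedged sketch of the dose-response ingredient is given below: a beta-Poisson model, P(response | dose d) = 1 - (1 + d/beta)^(-alpha), fitted by maximum likelihood to hypothetical quantal data, with the minimized binomial deviance reported. The time-dose-response (exponential-reciprocal) dependency and the outbreak exposure distributions used in the paper are not included.

```python
# Hedged sketch: maximum-likelihood fit of a beta-Poisson dose-response model,
# P(response | dose d) = 1 - (1 + d / beta) ** (-alpha), to hypothetical
# quantal data, reporting the minimized binomial deviance. The time-dependent
# (TDR) term and the outbreak exposure distributions are not included.
import numpy as np
from scipy.optimize import minimize

dose = np.array([1e2, 1e3, 1e4, 1e5, 1e6])         # hypothetical doses
n    = np.array([10, 10, 10, 10, 10])              # animals per dose group
resp = np.array([0, 2, 5, 8, 10])                  # hypothetical responders

def prob(d, alpha, beta):
    return 1.0 - (1.0 + d / beta) ** (-alpha)

def neg_loglik(log_params):
    alpha, beta = np.exp(log_params)               # positivity via log scale
    p = np.clip(prob(dose, alpha, beta), 1e-12, 1 - 1e-12)
    return -np.sum(resp * np.log(p) + (n - resp) * np.log(1 - p))

fit = minimize(neg_loglik, x0=np.log([0.3, 1e4]), method="Nelder-Mead")
alpha_hat, beta_hat = np.exp(fit.x)

# Binomial deviance against the saturated model (0 * log(0) treated as 0).
p_hat = prob(dose, alpha_hat, beta_hat)
exp_resp = n * p_hat
with np.errstate(divide="ignore", invalid="ignore"):
    t1 = np.where(resp > 0, resp * np.log(resp / exp_resp), 0.0)
    t2 = np.where(n - resp > 0, (n - resp) * np.log((n - resp) / (n - exp_resp)), 0.0)
deviance = 2 * np.sum(t1 + t2)
print(f"alpha = {alpha_hat:.3g}, beta = {beta_hat:.3g}, deviance = {deviance:.2f}")
```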