11.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from 2 to 3 billion dollars in losses late on August 12 to a peak of 50 billion dollars for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm hit the resort areas of Charlotte Harbor near Punta Gorda and then went on to Orlando in the central part of the state, with early poststorm estimates converging on a damage estimate in the 28 to 31 billion dollars range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To inform future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. Sensitivity analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss that is attributable to each of the input variables.
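As a rough illustration of the kind of sensitivity analysis described, the sketch below samples two inputs of a Holland-type wind model (the B parameter and the central pressure deficit) and estimates each input's share of the variance of peak wind speed via squared correlation. The peak-wind formula sqrt(B·Δp/(ρ·e)) is the standard Holland (1980) gradient-wind maximum; the sampling ranges and air density are assumed values for illustration, not those of any audited model.

```python
import math
import random

random.seed(1)

RHO = 1.15  # air density in kg/m^3 (assumed)

def vmax(b, dp):
    """Peak gradient wind of the Holland (1980) profile: sqrt(B*dp/(rho*e))."""
    return math.sqrt(b * dp / (RHO * math.e))

# Monte Carlo sample of the two uncertain inputs (assumed uniform ranges)
n = 10000
bs = [random.uniform(1.0, 2.5) for _ in range(n)]          # Holland B
dps = [random.uniform(2000.0, 8000.0) for _ in range(n)]   # pressure deficit, Pa
vs = [vmax(b, dp) for b, dp in zip(bs, dps)]

def r_squared(x, y):
    """Squared Pearson correlation: crude share of output variance tied to x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

print("share of wind-speed variance linked to B:        %.2f" % r_squared(bs, vs))
print("share linked to central pressure deficit:        %.2f" % r_squared(dps, vs))
```

For a nearly linear response like this one, the two shares sum to roughly one; a production audit would use a formal variance decomposition rather than this correlation shortcut.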
12.
Because of the inherent complexity of biological systems, there is often a choice between a number of apparently equally applicable physiologically based models to describe uptake and metabolism processes in toxicology or risk assessment. These models may fit the particular data sets of interest equally well, but may give quite different parameter estimates or predictions under different (extrapolated) conditions. Such competing models can be discriminated by a number of methods, including potential refutation by means of strategic experiments, and their ability to suitably incorporate all relevant physiological processes. For illustration, three currently used models for steady-state hepatic elimination (the venous equilibration model, the parallel tube model, and the distributed sinusoidal perfusion model) are reviewed and compared with particular reference to their application in the area of risk assessment. The ability of each of the models to describe and incorporate such physiological processes as protein binding, precursor-metabolite relations and hepatic zones of elimination, capillary recruitment, capillary heterogeneity, and intrahepatic shunting is discussed. Differences between the models in hepatic parameter estimation, extrapolation to different conditions, and interspecies scaling are discussed, and criteria for choosing one model over the others are presented. In this case, the distributed model provides the most general framework for describing physiological processes taking place in the liver, and has so far not been experimentally refuted, unlike the other two models. These simpler models may, however, provide useful bounds on parameter estimates and on extrapolations and risk assessments.
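The divergence between two of the competing models can be seen directly from their standard clearance formulas: the venous equilibration (well-stirred) model gives CL = Q·fu·CLint/(Q + fu·CLint), while the parallel tube model gives CL = Q·(1 − exp(−fu·CLint/Q)). The sketch below compares the two at increasing intrinsic clearance; the flow value is an assumed illustrative number.

```python
import math

def cl_venous_equilibration(q, fu, clint):
    """Well-stirred model: CL = Q*fu*CLint / (Q + fu*CLint)."""
    return q * fu * clint / (q + fu * clint)

def cl_parallel_tube(q, fu, clint):
    """Parallel tube model: CL = Q * (1 - exp(-fu*CLint/Q))."""
    return q * (1.0 - math.exp(-fu * clint / q))

q = 1.5   # hepatic blood flow, L/min (assumed)
fu = 1.0  # unbound fraction
for clint in (0.1, 1.0, 10.0, 100.0):
    ve = cl_venous_equilibration(q, fu, clint)
    pt = cl_parallel_tube(q, fu, clint)
    print(f"CLint={clint:6.1f}  well-stirred={ve:.3f}  parallel-tube={pt:.3f}")
```

At low intrinsic clearance the two models agree (both approach fu·CLint); at high intrinsic clearance both saturate toward hepatic flow Q, with the parallel tube model always predicting the higher clearance, which is exactly why extrapolated risk estimates can differ even when both models fit the calibration data.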
13.
Data from the 1988 National Survey on Families and Households were analyzed to examine the associations among marital conflict, ineffective parenting, and children's and adolescents' maladjustment. Parents' use of harsh discipline and low parental involvement helped explain the connection between marital conflict and children's maladjustment in children aged 2 through 11. Parent‐child conflict was measured only in families with a target teenager and also was a significant mediator. Although ineffective parenting explained part of the association between marital conflict and children's maladjustment, independent effects of marital conflict remained in families with target children aged 2 through 11 (but not for families with a teenager). With a few exceptions, this pattern of findings was consistent for mothers' and fathers' reports, for daughters and sons, for families with various ethnic backgrounds, and for families living in and out of poverty.
14.
This note exhibits two independent random variables on integers, X1 and X2, such that neither X1 nor X2 has a generalized Poisson distribution, but X1 + X2 does. This contradicts statements made by Professor Consul in his recent book.
15.
The authors consider the optimal design of sampling schedules for binary sequence data. They propose an approach which allows a variety of goals to be reflected in the utility function by including deterministic sampling cost, a term related to prediction, and, if relevant, a term related to learning about a treatment effect. To this end, they use a nonparametric probability model relying on a minimal number of assumptions. They show how their assumption of partial exchangeability for the binary sequence of data allows the sampling distribution to be written as a mixture of homogeneous Markov chains of order k. The implementation follows the approach of Quintana & Müller (2004), which uses a Dirichlet process prior for the mixture.
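The mixture components can be made concrete: the sketch below computes the plug-in (maximised) log-likelihood of a binary sequence under a single homogeneous Markov chain of order k, using observed transition frequencies. The paper's actual model mixes such components with Dirichlet-process weights, which is not implemented here; this only illustrates what one component of that mixture looks like.

```python
import math
from collections import Counter

def markov_loglik(seq, k=1):
    """Maximised log-likelihood of a binary sequence under a homogeneous
    Markov chain of order k, using plug-in transition frequencies."""
    trans, ctx = Counter(), Counter()
    for i in range(k, len(seq)):
        c = tuple(seq[i - k:i])   # the k-symbol context
        trans[(c, seq[i])] += 1
        ctx[c] += 1
    # sum over observed (context, symbol) cells of n * log(n / n_context)
    return sum(n * math.log(n / ctx[c]) for (c, y), n in trans.items())

seq = [0, 1, 1, 0, 1, 1, 0, 1]
print(markov_loglik(seq, k=1))
```

For this toy sequence every 0 is followed by a 1 and each 1 is followed by 0 or 1 equally often, so the order-1 log-likelihood reduces to 4·log(1/2) ≈ −2.773.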
16.
Summary.  In New Testament studies, the synoptic problem is concerned with the relationships between the gospels of Matthew, Mark and Luke. In an earlier paper a careful specification in probabilistic terms was set up of Honoré's triple-link model. In the present paper, a modification of Honoré's model is proposed. As previously, counts of the numbers of verbal agreements between the gospels are examined to investigate which of the possible triple-link models appears to give the best fit to the data, but now using the modified version of the model and additional sets of data.
17.
The authors propose graphical and numerical methods for checking the adequacy of the logistic regression model for matched case‐control data. Their approach is based on the cumulative sum of residuals over the covariate or linear predictor. Under the assumed model, the cumulative residual process converges weakly to a centered Gaussian limit whose distribution can be approximated via computer simulation. The observed cumulative residual pattern can then be compared both visually and analytically to a certain number of simulated realizations of the approximate limiting process under the null hypothesis. The proposed techniques allow one to check the functional form of each covariate, the logistic link function as well as the overall model adequacy. The authors assess the performance of the proposed methods through simulation studies and illustrate them using data from a cardiovascular study.
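The general idea of a cumulative-residual check can be sketched as follows: order observations by the covariate, accumulate residuals y − p, and compare the supremum of that process with realizations obtained by perturbing the residuals with independent standard normal multipliers (in the style of Lin, Wei and Ying). This toy version takes the success probabilities as known rather than estimated, so it omits the correction for parameter estimation that the authors' limiting process includes.

```python
import math
import random

random.seed(0)

def cum_residual_sup(x, y, p):
    """Sup of the cumulative residual process, accumulated in covariate order."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    s, sup = 0.0, 0.0
    for i in order:
        s += y[i] - p[i]
        sup = max(sup, abs(s))
    return sup

def simulate_sups(x, y, p, nsim=500):
    """Null reference draws: residuals perturbed by N(0,1) multipliers."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    sups = []
    for _ in range(nsim):
        s, sup = 0.0, 0.0
        for i in order:
            s += (y[i] - p[i]) * random.gauss(0.0, 1.0)
            sup = max(sup, abs(s))
        sups.append(sup)
    return sups

# toy data generated from the assumed logistic model
n = 200
x = [random.uniform(-2.0, 2.0) for _ in range(n)]
p = [1.0 / (1.0 + math.exp(-xi)) for xi in x]
y = [1 if random.random() < pi else 0 for pi in p]

obs = cum_residual_sup(x, y, p)
sims = simulate_sups(x, y, p)
pval = sum(s >= obs for s in sims) / len(sims)
print("approximate p-value:", pval)
```

A small p-value flags a misspecified functional form or link; in the real procedure p would come from the fitted matched case-control logistic model.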
18.
We consider acyclic directed mixed graphs, in which directed edges (x → y) and bi-directed edges (x ↔ y) may occur. A simple extension of Pearl's d-separation criterion, called m-separation, is applied to these graphs. We introduce a local Markov property which is equivalent to the global property resulting from the m-separation criterion for arbitrary distributions.
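The m-separation criterion can be sketched as a path search: a path is blocked at a non-collider that is in the conditioning set, and at a collider that is not an ancestor of the conditioning set, where a collider is a node with arrowheads pointing into it from both adjacent edges (bi-directed edges carry arrowheads at both ends). The toy implementation below, under those conventions, enumerates simple paths in a small ADMG; it is an illustration of the criterion, not the paper's local Markov machinery.

```python
def ancestors(directed, z):
    """Nodes with a directed path into z, including z itself."""
    anc = set(z)
    changed = True
    while changed:
        changed = False
        for a, b in directed:
            if b in anc and a not in anc:
                anc.add(a)
                changed = True
    return anc

def m_separated(x, y, z, directed, bidirected):
    """Check whether x and y are m-separated given z in an ADMG.

    directed:   set of pairs (a, b) meaning a -> b
    bidirected: set of frozensets {a, b} meaning a <-> b
    """
    z = set(z)
    anz = ancestors(directed, z)
    # Traversal steps record the arrowheads at both endpoints of each edge.
    steps = []
    for a, b in directed:
        steps.append((a, b, False, True))   # walking along a -> b
        steps.append((b, a, True, False))   # walking against a -> b
    for e in bidirected:
        a, b = tuple(e)
        steps.append((a, b, True, True))
        steps.append((b, a, True, True))
    # Depth-first search over simple paths starting at x.
    stack = [(x, None, {x})]  # (node, arrowhead-at-node on incoming edge, visited)
    while stack:
        node, head_in, visited = stack.pop()
        for u, v, head_u, head_v in steps:
            if u != node or v in visited:
                continue
            if head_in is not None:  # node is an intermediate vertex: check blocking
                collider = head_in and head_u
                if collider and node not in anz:
                    continue  # blocked: collider outside An(z)
                if not collider and node in z:
                    continue  # blocked: non-collider conditioned on
            if v == y:
                return False  # found an m-connecting path
            stack.append((v, head_v, visited | {v}))
    return True

# chain x -> m -> y: conditioning on m blocks the only path
print(m_separated('x', 'y', {'m'}, {('x', 'm'), ('m', 'y')}, set()))
```

On the chain this prints True; on the bi-directed chain x ↔ m ↔ y the behaviour flips, since m is a collider that conditioning opens rather than blocks.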
19.
We propose four different GMM estimators that allow almost consistent estimation of the structural parameters of panel probit models with fixed effects for the case of small T and large N. The moments used are derived for each period from a first order approximation of the mean of the dependent variable conditional on explanatory variables and on the fixed effect. The estimators differ w.r.t. the choice of instruments and whether they use trimming to reduce the bias or not. In a Monte Carlo study, we compare these estimators with pooled probit and conditional logit estimators for different data generating processes. The results show that the proposed estimators outperform these competitors in several situations.
20.
The authors extend the block external bootstrap to partially linear regression models with strongly mixing, nonstationary error terms. In addition to providing an approximate distribution for the semiparametric least squares estimator of the parametric component, they propose a consistent estimator of the covariance matrix of this estimator.
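The block-resampling idea underlying this kind of bootstrap is simple to illustrate: to preserve serial dependence in mixing error terms, one resamples contiguous blocks of residuals rather than individual values. The sketch below generates one moving-block replicate; it is a generic block bootstrap, not the authors' external (wild-type) variant, which would further multiply each block by an external random weight.

```python
import random

random.seed(42)

def block_bootstrap(series, block_len):
    """One moving-block bootstrap replicate: overlapping blocks of length
    block_len are drawn with replacement and concatenated to length n."""
    n = len(series)
    blocks = [series[i:i + block_len] for i in range(n - block_len + 1)]
    out = []
    while len(out) < n:
        out.extend(random.choice(blocks))
    return out[:n]

resid = [0.3, -0.1, 0.5, 0.2, -0.4, 0.1, -0.2, 0.6]
rep = block_bootstrap(resid, 3)
print(rep)
```

Each replicate reproduces the within-block dependence of the original series; the block length is a tuning parameter that must grow with the sample size for consistency under mixing.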
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号