2,691 results found (search time: 15 ms)
1.
The product of two independent or dependent scalar normal variables, sums of products, sample covariances, and general bilinear forms are considered. Their distributions are shown to belong to a class called generalized Laplacian. A growth-decay mechanism is also shown to produce such a generalized Laplacian. Sets of necessary and sufficient conditions are derived for bilinear forms to belong to this class. As a generalization, the distributions of rectangular matrices associated with multivariate normal random vectors are also discussed.
2.
Complete and partial diallel cross designs are examined as to their construction and robustness against the loss of a block of observations. A simple generalized inverse is found for the information matrix of the line effects, which allows evaluation of expressions for the variances of the line-effect differences with and without the missing block. A-efficiencies, based on average variances of the elementary contrasts of the line-effects, suggest that these designs are fairly robust. The loss of efficiency is generally less than 10%, but it is shown that specific comparisons might suffer a loss of efficiency of as much as 40%.
3.
Many applications of nonparametric tests based on curve estimation involve selecting a smoothing parameter. The author proposes an adaptive test that combines several generalized likelihood ratio tests in order to get power performance nearly equal to whichever of the component tests is best. She derives the asymptotic joint distribution of the component tests and that of the proposed test under the null hypothesis. She also develops a simple method of selecting the smoothing parameters for the proposed test and presents two approximate methods for obtaining its P‐value. Finally, she evaluates the proposed test through simulations and illustrates its application to a set of real data.
4.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from 2 to 3 billion dollars in losses late on August 12 to a peak of 50 billion dollars for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm hit the resort areas of Charlotte Harbor near Punta Gorda and then went on to Orlando in the central part of the state, with early poststorm estimates converging on a damage estimate in the 28 to 31 billion dollar range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To inform such future analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. Sensitivity analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss attributable to each of the input variables.
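The "Holland B parameter" wind field refers to Holland's (1980) radial wind profile, in which the shape parameter B controls how sharply the winds peak near the radius of maximum winds. A minimal sketch of the gradient-level profile (the Coriolis term is omitted for simplicity, and the parameter values are illustrative assumptions, not those of any audited model):

```python
import numpy as np

def holland_wind_speed(r_km, b=1.5, dp_pa=5000.0, rmax_km=30.0, rho=1.15):
    """Holland (1980) gradient-level wind profile, cyclostrophic form.

    r_km    : radius from the storm centre (km)
    b       : Holland B shape parameter (dimensionless, typically ~1-2.5)
    dp_pa   : central pressure deficit (Pa)
    rmax_km : radius of maximum winds (km)
    rho     : air density (kg/m^3)
    """
    x = (rmax_km / np.asarray(r_km, dtype=float)) ** b
    return np.sqrt(b * dp_pa / rho * x * np.exp(-x))

radii = np.array([10.0, 30.0, 60.0, 120.0])
print(holland_wind_speed(radii))  # m/s at each radius; peak at rmax_km
```

At r = rmax the profile reduces to sqrt(B * dP / (rho * e)), which is why loss projections are so sensitive to the B parameter: the peak wind, and hence damage, scales with sqrt(B).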
5.
While most of epidemiology is observational, rather than experimental, the culture of epidemiology is still derived from agricultural experiments, rather than other observational fields, such as astronomy or economics. The mismatch is made greater as focus has turned to continuous risk factors, multifactorial outcomes, and outcomes with large variation unexplainable by available risk factors. The analysis of such data is often viewed as hypothesis testing with statistical control replacing randomization. However, such approaches often test restricted forms of the hypothesis being investigated, such as the hypothesis of a linear association, when there is no prior empirical or theoretical reason to believe that if an association exists, it is linear. In combination with the large nonstochastic sources of error in such observational studies, this suggests the more flexible alternative of exploring the association. Conclusions on the possible causal nature of any discovered association will rest on the coherence and consistency of multiple studies. Nonparametric smoothing in general, and generalized additive models in particular, represent an attractive approach to such problems. This is illustrated using data examining the relationship between particulate air pollution and daily mortality in Birmingham, Alabama; between particulate air pollution, ozone, and SO2 and daily hospital admissions for respiratory illness in Philadelphia; and between ozone and particulate air pollution and coughing episodes in children in six eastern U.S. cities. The results indicate that airborne particles and ozone are associated with adverse health outcomes at very low concentrations, and that there are likely no thresholds for these relationships.
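The kind of nonparametric exploration the abstract advocates can be illustrated with a minimal Nadaraya-Watson kernel smoother, the simplest relative of the smooth terms in a generalized additive model. The data below are synthetic stand-ins, not the Birmingham or Philadelphia series:

```python
import numpy as np

def kernel_smooth(x, y, grid, bandwidth):
    """Nadaraya-Watson estimator with a Gaussian kernel:
    m(g) = sum_i K((g - x_i)/h) * y_i / sum_i K((g - x_i)/h)."""
    x, y, grid = map(np.asarray, (x, y, grid))
    w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(1)
pm10 = rng.uniform(10, 100, 300)                    # hypothetical particle levels
log_rate = 0.002 * pm10 + rng.normal(0, 0.05, 300)  # hypothetical log outcome rate
grid = np.linspace(20, 90, 8)
print(kernel_smooth(pm10, log_rate, grid, bandwidth=10.0))
```

Unlike a forced linear fit, the smoothed curve lets the data reveal plateaus or thresholds if they exist; here none was built into the simulated association.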
6.
This note exhibits two independent random variables on integers, X1 and X2, such that neither X1 nor X2 has a generalized Poisson distribution, but X1 + X2 has. This contradicts statements made by Professor Consul in his recent book.
7.
To reduce nonresponse bias in sample surveys, a method of nonresponse weighting adjustment is often used which consists of multiplying the sampling weight of the respondent by the inverse of the estimated response probability. The authors examine the asymptotic properties of this estimator. They prove that it is generally more efficient than an estimator which uses the true response probability, provided that the parameters which govern this probability are estimated by maximum likelihood. The authors discuss variance estimation methods that account for the effect of using the estimated response probability; they compare their performances in a small simulation study. They also discuss extensions to the regression estimator.
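The weighting adjustment can be sketched in a small simulation. With a single binary covariate driving response, the maximum likelihood estimate of the response probability is simply the observed response rate within each group, and respondents are weighted by its inverse. All values here are simulated for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20_000
group = rng.integers(0, 2, n)              # covariate driving response
y = 10.0 + 5.0 * group + rng.normal(0, 1, n)
p_true = np.where(group == 1, 0.9, 0.5)    # true response probabilities
respond = rng.random(n) < p_true

# ML estimate of the response probability within each group
# (the sample response rate), then inverse-probability weighting.
p_hat = np.array([respond[group == g].mean() for g in (0, 1)])[group]
adjusted = np.sum(y[respond] / p_hat[respond]) / np.sum(1.0 / p_hat[respond])

naive = y[respond].mean()  # biased toward the high-response group
print(f"true mean {y.mean():.3f}, naive {naive:.3f}, adjusted {adjusted:.3f}")
```

The naive respondent mean overweights the high-response group, while the weighted estimator recovers the full-population mean; the paper's (perhaps surprising) result is that weighting by the *estimated* probability is generally more efficient than weighting by the true one.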
8.
An interactive multimedia presentation system generally consists of two parts: the presentation content and the presentation control flow. By introducing generalized transition networks, this paper gives a formal description of the internal presentation flow of interactive multimedia applications, and discusses the reusability of presentation flows under this formal description. It proposes an implementation strategy that separates the presentation flow from the presentation content, and gives a representation of presentation content that combines content descriptions with presentable objects. The strategy and method can be used to design and implement an authoring platform for interactive multimedia presentation systems; presentation systems authored on such a platform can present information in multiple media forms concurrently, and both the systems and their components are highly reusable.
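The separation of flow from content that the paper proposes can be sketched as a small transition network: states are presentation steps, edges fire on user events, and the content bound to each state is kept in a separate table so the same network can be reused with different media. All state, event, and content names here are hypothetical:

```python
# Transition network: (state, event) -> next state. The flow is pure
# structure; it says nothing about what media each state presents.
flow = {
    ("intro", "next"): "menu",
    ("menu",  "play"): "video",
    ("menu",  "quit"): "end",
    ("video", "done"): "menu",
}

# Content bound to each state, kept separate from the flow.
content = {
    "intro": "show title slide",
    "menu":  "show menu image + background audio",
    "video": "play clip",
    "end":   "show credits",
}

def run(events, state="intro"):
    """Drive the network with a sequence of user events."""
    trace = [content[state]]
    for ev in events:
        state = flow[(state, ev)]
        trace.append(content[state])
    return state, trace

final, trace = run(["next", "play", "done", "quit"])
print(final, trace)
```

Swapping the `content` table while keeping `flow` intact is the reuse the paper describes: the same control network can present entirely different media.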
9.
The quantification of the relationship between the amount of microbial organisms ingested and a specific outcome such as infection, illness, or mortality is a key aspect of quantitative risk assessment. A main problem in determining such dose-response models is the availability of appropriate data. Human feeding trials have been criticized because only young healthy volunteers are selected to participate and low doses, as often occurring in real life, are typically not considered. Epidemiological outbreak data are considered to be more valuable, but are more subject to data uncertainty. In this article, we model the dose-illness relationship based on data of 20 Salmonella outbreaks, as discussed by the World Health Organization. In particular, we model the dose-illness relationship using generalized linear mixed models and fractional polynomials of dose. The fractional polynomial models are modified to satisfy the properties of different types of dose-illness models as proposed by Teunis et al. Within these models, differences in host susceptibility (susceptible versus normal population) are modeled as fixed effects whereas differences in serovar type and food matrix are modeled as random effects. In addition, two bootstrap procedures are presented. A first procedure accounts for stochastic variability whereas a second procedure accounts for both stochastic variability and data uncertainty. The analyses indicate that the susceptible population has a higher probability of illness at low dose levels when the combination pathogen-food matrix is extremely virulent and at high dose levels when the combination is less virulent. Furthermore, the analyses suggest that immunity exists in the normal population but not in the susceptible population.
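A minimal member of the model family described above is a logistic fit in log10 dose, the fractional-polynomial power p = 0 case. The outbreak-style counts below are invented for illustration, and the fixed effects for susceptibility and the random effects for serovar and food matrix are omitted:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Hypothetical outbreak-style records: log10 dose, exposed, and ill.
log_dose = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
exposed  = np.array([50, 60, 55, 40, 30])
ill      = np.array([2, 8, 20, 26, 27])

def nll(beta):
    """Binomial negative log-likelihood for P(ill) = expit(b0 + b1*log_dose)."""
    p = expit(beta[0] + beta[1] * log_dose)
    return -np.sum(ill * np.log(p) + (exposed - ill) * np.log(1 - p))

fit = minimize(nll, x0=np.array([-3.0, 1.0]), method="Nelder-Mead")
b0, b1 = fit.x
print(f"intercept {b0:.2f}, log-dose slope {b1:.2f}")
```

In the paper's setting this fixed-effects core would be embedded in a generalized linear mixed model so that outbreaks sharing a serovar or food matrix borrow strength from one another.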
10.
The benchmark dose (BMD) is an exposure level that would induce a small risk increase (BMR level) above the background. The BMD approach to deriving a reference dose for risk assessment of noncancer effects is advantageous in that the estimate of BMD is not restricted to experimental doses and utilizes most available dose-response information. To quantify statistical uncertainty of a BMD estimate, we often calculate and report its lower confidence limit (i.e., BMDL), and may even consider it as a more conservative alternative to BMD itself. Computation of BMDL may involve normal confidence limits to BMD in conjunction with the delta method. Therefore, factors, such as small sample size and nonlinearity in model parameters, can affect the performance of the delta method BMDL, and alternative methods are useful. In this article, we propose a bootstrap method to estimate BMDL utilizing a scheme that consists of a resampling of residuals after model fitting and a one-step formula for parameter estimation. We illustrate the method with clustered binary data from developmental toxicity experiments. Our analysis shows that with moderately elevated dose-response data, the distribution of the BMD estimator tends to be left-skewed and bootstrap BMDLs are smaller than the delta method BMDLs on average, hence quantifying risk more conservatively. Statistically, the bootstrap BMDL quantifies the uncertainty of the true BMD more honestly than the delta method BMDL as its coverage probability is closer to the nominal level than that of the delta method BMDL. We find that BMD and BMDL estimates are generally insensitive to model choices provided that the models fit the data comparably well near the region of BMD. Our analysis also suggests that, in the presence of a significant and moderately strong dose-response relationship, the developmental toxicity experiments under the standard protocol support dose-response assessment at 5% BMR for BMD and 95% confidence level for BMDL.
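The BMD and bootstrap BMDL computation can be sketched for quantal data with a logistic model and an added-risk definition of the BMR. For simplicity this sketch resamples binomial counts parametrically rather than using the paper's residual-resampling and one-step estimation scheme, and the dose-response counts are invented:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit, logit

dose  = np.array([0.0, 25.0, 50.0, 100.0, 200.0])  # hypothetical design
n     = np.array([25, 25, 25, 25, 25])
cases = np.array([1, 2, 5, 11, 20])                # hypothetical responses

def fit_logistic(k):
    """ML fit of P(d) = expit(b0 + b1*d) to counts k out of n."""
    nll = lambda b: -np.sum(k * np.log(expit(b[0] + b[1] * dose))
                            + (n - k) * np.log(expit(-(b[0] + b[1] * dose))))
    return minimize(nll, x0=np.array([-3.0, 0.02]), method="Nelder-Mead").x

def bmd(beta, bmr=0.05):
    """Dose at which added risk over background equals the BMR."""
    p0 = expit(beta[0])
    return (logit(p0 + bmr) - beta[0]) / beta[1]

beta_hat = fit_logistic(cases)
rng = np.random.default_rng(3)
p_hat = expit(beta_hat[0] + beta_hat[1] * dose)
boot = [bmd(fit_logistic(rng.binomial(n, p_hat))) for _ in range(200)]
bmdl = np.percentile(boot, 5)  # one-sided lower 95% confidence limit
print(f"BMD {bmd(beta_hat):.1f}, bootstrap BMDL {bmdl:.1f}")
```

Reporting the 5th percentile of the bootstrap BMD distribution as the BMDL sidesteps the normality assumption of the delta method, which is the motivation the abstract gives for the bootstrap approach.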

Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号