Subscription full text: 1,778
Open access: 42
Domestic open access: 3
By subject: Management 77; Ethnology 5; Demography 17; Collected works 17; Theory and methodology 13; General 165; Sociology 52; Statistics 1,477
By year: 2023: 10; 2022: 9; 2021: 11; 2020: 28; 2019: 76; 2018: 87; 2017: 143; 2016: 41; 2015: 43; 2014: 57; 2013: 470; 2012: 142; 2011: 48; 2010: 48; 2009: 47; 2008: 60; 2007: 55; 2006: 56; 2005: 46; 2004: 41; 2003: 38; 2002: 41; 2001: 35; 2000: 34; 1999: 32; 1998: 20; 1997: 13; 1996: 13; 1995: 6; 1994: 5; 1993: 5; 1992: 9; 1991: 9; 1990: 5; 1989: 1; 1988: 6; 1987: 1; 1986: 3; 1985: 6; 1984: 4; 1983: 7; 1982: 5; 1981: 2; 1980: 2; 1979: 2; 1977: 1
1,823 results found (search time: 15 ms)
11.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from 2 to 3 billion dollars in losses late on August 12 to a peak of 50 billion dollars for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm hit the resort areas of Charlotte Harbor near Punta Gorda and then went on to Orlando in the central part of the state, with early poststorm estimates converging on the 28 to 31 billion dollar range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To inform future analyses of this kind, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. The sensitivity analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss that is attributable to each of the input variables.
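The Holland B parameter named above controls the shape of the parametric wind profile of Holland (1980). As a rough illustration (not the audited proprietary model), the cyclostrophic form of that profile can be sketched as follows; the radius of maximum winds, central pressure deficit, B value, and air density used here are assumed placeholder values:

```python
import math

def holland_wind_speed(r_km, rmax_km=30.0, dp_pa=5000.0, b=1.5, rho=1.15):
    """Gradient-level wind speed (m/s) at radius r_km from the storm
    center, using the cyclostrophic form of the Holland (1980) profile
    (Coriolis term neglected).  rmax_km, dp_pa (central pressure
    deficit), b, and rho are illustrative placeholder values."""
    x = (rmax_km / r_km) ** b
    return math.sqrt(b * dp_pa * x * math.exp(-x) / rho)

# The profile peaks at r = rmax, where it reduces to sqrt(B * dp / (rho * e)).
v_max = holland_wind_speed(30.0)
v_far = holland_wind_speed(150.0)   # winds decay away from the eyewall
```

Uncertainty in B propagates directly into the peak wind speed and hence, nonlinearly, into projected losses, which is why the audits focus on it.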
12.
13.
Using the operating characteristic (OC) curve of sampling inspection, this paper analyzes how changes in the three parameters N, n, and Ac of a single sampling inspection plan by attributes affect the producer's risk and the consumer's risk.
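A minimal sketch of the OC-curve calculation behind such an analysis, assuming the usual binomial model for single sampling by attributes (the AQL and LTPD values below are illustrative, not taken from the article):

```python
from math import comb

def accept_prob(p, n, ac):
    """P(accept lot) under the binomial model: at most ac defectives
    found in a sample of n when the lot fraction defective is p."""
    return sum(comb(n, d) * p ** d * (1 - p) ** (n - d) for d in range(ac + 1))

def risks(n, ac, aql=0.01, ltpd=0.05):
    """Producer's risk (alpha: rejecting a good lot at the AQL) and
    consumer's risk (beta: accepting a bad lot at the LTPD).
    The AQL/LTPD values are illustrative assumptions."""
    return 1 - accept_prob(aql, n, ac), accept_prob(ltpd, n, ac)

# Doubling n with Ac held fixed tightens the plan: beta falls, alpha rises.
alpha_50, beta_50 = risks(50, 1)
alpha_100, beta_100 = risks(100, 1)
```

Sweeping `p` over a grid of `accept_prob(p, n, ac)` values traces the OC curve for a given plan, making the trade-off between the two risks visible.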
14.
Determination of trace selenium in soil samples by slurry-sampling graphite furnace atomic absorption spectrometry   Cited by: 4 (self-citations: 0; citations by others: 4)
Trace selenium in soil samples was determined by slurry-sampling graphite furnace atomic absorption spectrometry, using palladium–ascorbic acid as the matrix modifier and zirconium-coated graphite tubes. Combined with a series of measures such as constructing the calibration curve from certified reference materials, these steps effectively eliminated interference from coexisting elements and ensured that atomization conditions were consistent between the standard series and the soil samples, yielding satisfactory analytical results.
15.
By discussing whether, and in what way, the spectrum of a signal is distorted when it is sampled exactly at the Nyquist rate, this paper analyzes the problems that arise when the sampling theorem is applied at its boundary condition in discrete Fourier transform (DFT) processing. It demonstrates the form and severity of the spectral distortion that occurs under this condition and, finally, from the standpoint of distortion-free recovery of the time-domain signal x(t), discusses how the sampling theorem should be stated in the DFT setting.
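The boundary case can be demonstrated numerically: sampling exactly at the Nyquist rate preserves a cosine component at fs/2 but annihilates the corresponding sine component, so x(t) cannot in general be recovered. A small sketch (frequencies are illustrative):

```python
import math

def sample(signal, fs, n):
    """Take n samples of signal(t) at sampling rate fs."""
    return [signal(k / fs) for k in range(n)]

f = 100.0       # signal frequency in Hz (illustrative)
fs = 2 * f      # sampling exactly at the Nyquist rate

# A sine at exactly fs/2 samples to all zeros: the component is lost.
sine = sample(lambda t: math.sin(2 * math.pi * f * t), fs, 8)

# A cosine at fs/2 survives as the alternating sequence +1, -1, +1, ...
cosine = sample(lambda t: math.cos(2 * math.pi * f * t), fs, 8)
```

The phase dependence of this result is exactly why the strict statement of the theorem requires fs to exceed, not merely equal, twice the highest frequency.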
16.
Drawing on the classroom achievement-motivation theory of educational psychology together with the practice of chemistry teaching, the authors selected 14 kinds of chemistry learning motivation for a questionnaire survey and carried out both overall and sampled statistical analyses, reaching the following conclusion: chemistry teachers can effectively cultivate, stimulate, and reinforce students' motivation to learn chemistry in the course of teaching, improving the efficiency of both teaching and learning and better developing students' creativity and practical ability.
17.
Annual concentrations of toxic air contaminants are of primary concern from the perspective of chronic human exposure assessment and risk analysis. Despite recent advances in air quality monitoring technology, resource and technical constraints often impose limitations on the availability of a sufficient number of ambient concentration measurements for performing environmental risk analysis. Therefore, sample size limitations, representativeness of data, and uncertainties in the estimated annual mean concentration must be examined before performing quantitative risk analysis. In this paper, we discuss several factors that need to be considered in designing field-sampling programs for toxic air contaminants and in verifying compliance with environmental regulations. Specifically, we examine the behavior of SO2, TSP, and CO data as surrogates for toxic air contaminants and as examples of point source, area source, and line source-dominated pollutants, respectively, from the standpoint of sampling design. We demonstrate the use of the bootstrap resampling method and normal theory in estimating the annual mean concentration and its 95% confidence bounds from limited sampling data, and illustrate the application of operating characteristic (OC) curves to determine optimum sample size and other sampling strategies. We also outline a statistical procedure, based on a one-sided t-test, that utilizes the sampled concentration data for evaluating whether a sampling site is in compliance with relevant ambient guideline concentrations for toxic air contaminants.
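The bootstrap step described above can be sketched as follows; the percentile method and the hypothetical SO2 values are illustrative stand-ins for the authors' data and exact procedure:

```python
import random
import statistics

def bootstrap_mean_ci(sample, n_boot=2000, alpha=0.05, seed=42):
    """Percentile-bootstrap confidence interval for the mean -- a sketch
    of the resampling idea, not the article's exact procedure."""
    rng = random.Random(seed)
    means = sorted(
        statistics.fmean(rng.choices(sample, k=len(sample)))  # resample with replacement
        for _ in range(n_boot)
    )
    return means[int(n_boot * alpha / 2)], means[int(n_boot * (1 - alpha / 2)) - 1]

# Hypothetical short campaign of 24-hour SO2 averages (ppb).
obs = [8.2, 5.1, 12.4, 7.7, 9.9, 4.3, 15.0, 6.8, 10.2, 7.1]
low, high = bootstrap_mean_ci(obs)
```

The attraction of the bootstrap here is that it makes no normality assumption, which matters for the skewed distributions typical of short air-quality records.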
18.
This article introduces a new model for transaction prices in the presence of market microstructure noise in order to study the properties of the price process on two different time scales, namely, transaction time, where prices are sampled with every transaction, and tick time, where prices are sampled with every price change. Both sampling schemes have been used in the literature on realized variance, but a formal investigation into their properties has been lacking. Our empirical and theoretical results indicate that the return dynamics in transaction time are very different from those in tick time, and the choice of sampling scheme can therefore have an important impact on the properties of realized variance. For RV we find that tick time sampling is superior to transaction time sampling in terms of mean-squared error, especially when the level of noise, the number of ticks, or the arrival frequency of efficient price moves is low. Importantly, we show that while the microstructure noise may appear close to IID in transaction time, in tick time it is highly dependent. As a result, bias correction procedures that rely on the noise being independent can fail in tick time and are better implemented in transaction time.
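The two sampling schemes can be made concrete with a toy price series. A tick-time series keeps only the price changes; over the full sample the zero returns dropped in tick time contribute nothing to RV, so the schemes only diverge once sparser sampling or noise corrections enter, as the article analyzes. The helper names and prices below are illustrative:

```python
import math

def realized_variance(prices):
    """Sum of squared log returns over a price sequence."""
    rets = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    return sum(r * r for r in rets)

def tick_time(prices):
    """Tick-time sampling: keep only observations where the price changed."""
    kept = [prices[0]]
    for p in prices[1:]:
        if p != kept[-1]:
            kept.append(p)
    return kept

# Hypothetical transaction prices; repeated trades give zero returns.
trades = [10.00, 10.00, 10.01, 10.01, 10.01, 10.02, 10.01, 10.01, 10.02]
rv_transaction = realized_variance(trades)
rv_tick = realized_variance(tick_time(trades))  # zero returns drop out
```

Note that the tick-time series is strictly shorter, so per-observation return moments (and hence the apparent dependence of the noise) differ even though full-sample RV coincides here.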
19.
The authors consider the optimal design of sampling schedules for binary sequence data. They propose an approach which allows a variety of goals to be reflected in the utility function by including deterministic sampling cost, a term related to prediction, and, if relevant, a term related to learning about a treatment effect. To this end, they use a nonparametric probability model relying on a minimal number of assumptions. They show how their assumption of partial exchangeability for the binary sequence of data allows the sampling distribution to be written as a mixture of homogeneous Markov chains of order k. The implementation follows the approach of Quintana & Müller (2004), which uses a Dirichlet process prior for the mixture.
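The mixture representation implied by partial exchangeability can be illustrated with a toy generator: a finite mixture of homogeneous order-1 Markov chains standing in for the Dirichlet-process mixture of order-k chains used in the article. All parameter values are illustrative:

```python
import random

def sample_sequence(length, chains, weights, rng):
    """Draw one binary sequence from a finite mixture of homogeneous
    order-1 Markov chains.  Each chain is parametrized (illustratively)
    as (p_start_1, p_1_given_0, p_1_given_1)."""
    p_start, p1_given0, p1_given1 = rng.choices(chains, weights=weights)[0]
    seq = [1 if rng.random() < p_start else 0]
    for _ in range(length - 1):
        p1 = p1_given1 if seq[-1] == 1 else p1_given0
        seq.append(1 if rng.random() < p1 else 0)
    return seq

rng = random.Random(0)
chains = [(0.5, 0.1, 0.9),   # persistent chain: tends to repeat its last state
          (0.5, 0.9, 0.1)]   # alternating chain: tends to flip
seqs = [sample_sequence(20, chains, [0.5, 0.5], rng) for _ in range(5)]
```

Marginally the draws are partially exchangeable even though, conditional on the (latent) chain, each sequence is plainly Markov, which is the structure the utility-based design exploits.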
20.
To reduce nonresponse bias in sample surveys, a method of nonresponse weighting adjustment is often used which consists of multiplying the sampling weight of each respondent by the inverse of the estimated response probability. The authors examine the asymptotic properties of this estimator. They prove that it is generally more efficient than an estimator which uses the true response probability, provided that the parameters which govern this probability are estimated by maximum likelihood. The authors discuss variance estimation methods that account for the effect of using the estimated response probability and compare their performance in a small simulation study. They also discuss extensions to the regression estimator.
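As a rough sketch of the adjustment itself, using a simple weighting-class estimate of the response probability in place of the maximum likelihood estimation the authors analyze (all names and data below are hypothetical):

```python
from collections import defaultdict

def adjusted_total(records):
    """Nonresponse-adjusted estimate of the total of y.  Each record is
    (design_weight, class_label, responded, y_or_None).  The response
    probability is estimated within weighting classes by the weighted
    response rate -- a stand-in for the ML estimation in the article."""
    w_all, w_resp = defaultdict(float), defaultdict(float)
    for w, c, responded, _ in records:
        w_all[c] += w
        if responded:
            w_resp[c] += w
    total = 0.0
    for w, c, responded, y in records:
        if responded:
            p_hat = w_resp[c] / w_all[c]   # estimated response probability
            total += (w / p_hat) * y       # inverse-probability weighting
    return total

# Hypothetical survey: two weighting classes with different response rates.
data = [
    (10.0, "urban", True, 5.0), (10.0, "urban", True, 7.0),
    (10.0, "urban", False, None),
    (20.0, "rural", True, 3.0), (20.0, "rural", False, None),
]
est = adjusted_total(data)
```

Inflating respondent weights by 1/p̂ restores the weight mass lost to nonrespondents within each class, which removes the bias when response depends only on the class.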
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号