1.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainty. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from roughly 2-3 billion dollars in losses late on August 12 to a brief peak of 50 billion dollars as the storm appeared to be headed for the Tampa Bay area. The storm hit the resort areas of Charlotte Harbor near Punta Gorda and then went on to Orlando in the central part of the state, with early post-storm estimates converging on a damage estimate in the 28 to 31 billion dollar range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To inform such future analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. The uncertainty analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss that is attributable to each of the input variables.
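A Holland B parameter wind field model is built around the Holland (1980) gradient wind relation. The sketch below is a minimal illustration of that relation only, not the audited loss models; the air density, Coriolis parameter, and storm parameters are assumed values chosen for illustration.

```python
import numpy as np

# Minimal sketch of the Holland (1980) gradient wind profile underlying
# "Holland B" hurricane wind field models. All constants are assumptions.
RHO_AIR = 1.15        # air density, kg/m^3 (assumed)
F_CORIOLIS = 5e-5     # Coriolis parameter at roughly 20 deg latitude, 1/s (assumed)

def holland_gradient_wind(r_m, p_central_pa, p_ambient_pa, r_max_m, b):
    """Gradient-level wind speed (m/s) at radius r_m from the storm center."""
    dp = p_ambient_pa - p_central_pa              # central pressure deficit
    x = (r_max_m / r_m) ** b
    term = b * dp / RHO_AIR * x * np.exp(-x)      # pressure-gradient contribution
    coriolis = r_m * F_CORIOLIS / 2.0
    return np.sqrt(term + coriolis ** 2) - coriolis

# Example profile for a Charley-like storm (illustrative numbers only).
radii = np.linspace(5e3, 200e3, 40)
v = holland_gradient_wind(radii, p_central_pa=94_100, p_ambient_pa=101_300,
                          r_max_m=15e3, b=1.3)
print(f"peak wind ~{v.max():.0f} m/s at r ~{radii[v.argmax()] / 1e3:.0f} km")
```

In this form the B parameter controls how sharply the wind peaks near the radius of maximum winds, which is why uncertainty in B propagates strongly into projected losses.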
2.
The quantification of the relationship between the amount of microbial organisms ingested and a specific outcome such as infection, illness, or mortality is a key aspect of quantitative risk assessment. A main problem in determining such dose-response models is the availability of appropriate data. Human feeding trials have been criticized because only young healthy volunteers are selected to participate and low doses, as often occur in real life, are typically not considered. Epidemiological outbreak data are considered more valuable, but are more subject to data uncertainty. In this article, we model the dose-illness relationship based on data from 20 Salmonella outbreaks, as discussed by the World Health Organization. In particular, we model the dose-illness relationship using generalized linear mixed models and fractional polynomials of dose. The fractional polynomial models are modified to satisfy the properties of the different types of dose-illness models proposed by Teunis et al. Within these models, differences in host susceptibility (susceptible versus normal population) are modeled as fixed effects, whereas differences in serovar type and food matrix are modeled as random effects. In addition, two bootstrap procedures are presented. The first procedure accounts for stochastic variability, whereas the second accounts for both stochastic variability and data uncertainty. The analyses indicate that the susceptible population has a higher probability of illness at low dose levels when the pathogen-food matrix combination is extremely virulent, and at high dose levels when the combination is less virulent. Furthermore, the analyses suggest that immunity exists in the normal population but not in the susceptible population.
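As a rough illustration of fitting a fractional-polynomial dose-illness curve, the sketch below fits a fixed-effects binomial GLM with log10(dose) as a first-order fractional polynomial. The outbreak-style data are invented, and the random effects for serovar type and food matrix used in the article are omitted here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical outbreak-style data: ingested dose (CFU), number exposed,
# number ill, and a susceptible-population indicator. Values are invented.
df = pd.DataFrame({
    "dose":        [1e1, 1e2, 1e3, 1e4, 1e5, 1e1, 1e2, 1e3, 1e4, 1e5],
    "exposed":     [40,  55,  60,  30,  25,  35,  50,  45,  40,  20],
    "ill":         [1,   4,   12,  14,  18,  3,   9,   20,  28,  17],
    "susceptible": [0,   0,   0,   0,   0,   1,   1,   1,   1,   1],
})

# First-order fractional polynomial of dose with power p = 0 (i.e. log dose);
# in practice the power would be chosen from the usual FP candidate set.
df["fp_dose"] = np.log10(df["dose"])

X = sm.add_constant(df[["fp_dose", "susceptible"]])
y = df[["ill"]].assign(not_ill=df["exposed"] - df["ill"])  # (successes, failures)

fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
print(fit.summary())
print("P(ill) at dose 1e3, susceptible group:",
      fit.predict([[1.0, 3.0, 1.0]])[0])
```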
3.
This paper studies two models of rational behavior under uncertainty whose predictions are invariant under ordinal transformations of utility. The quantile utility model assumes that the agent maximizes some quantile of the distribution of utility. The utility mass model assumes maximization of the probability of obtaining an outcome whose utility is higher than some fixed critical value. Both models satisfy weak stochastic dominance; lexicographic refinements satisfy strong dominance. The study of these utility models suggests a significant generalization of traditional ideas of riskiness and risk preference. We define one action to be riskier than another if the utility distribution of the latter crosses that of the former from below. The single crossing property is equivalent to a minmax spread of a random variable. With relative risk defined by the single crossing criterion, the risk preference of a quantile utility maximizer increases with the utility distribution quantile that he maximizes. The risk preference of a utility mass maximizer increases with his critical utility value.
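A minimal numerical sketch of the two decision rules, using invented utility distributions: the quantile rule picks the action with the largest tau-quantile of utility, and the utility-mass rule picks the action with the largest probability of exceeding a critical utility value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each action is represented by draws from its utility distribution;
# the distributions below are invented purely for illustration.
actions = {
    "safe":  rng.normal(loc=0.5, scale=0.05, size=10_000),
    "risky": rng.normal(loc=0.6, scale=0.40, size=10_000),
}

def quantile_utility_choice(actions, tau):
    """Pick the action maximizing the tau-quantile of its utility distribution."""
    return max(actions, key=lambda a: np.quantile(actions[a], tau))

def utility_mass_choice(actions, u_crit):
    """Pick the action maximizing P(utility > u_crit)."""
    return max(actions, key=lambda a: np.mean(actions[a] > u_crit))

# Risk preference rises with tau (quantile rule) and with u_crit (mass rule).
for tau in (0.1, 0.5, 0.9):
    print(f"tau={tau}: choose {quantile_utility_choice(actions, tau)}")
for u_crit in (0.3, 0.55, 0.9):
    print(f"u*={u_crit}: choose {utility_mass_choice(actions, u_crit)}")
```

Raising tau or the critical value shifts the choice toward the riskier action, matching the comparative statics described in the abstract.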
4.
Economics of Radiation Protection: Equity Considerations
In order to implement cost-benefit analysis of protective actions to reduce radiological exposures, one needs to attribute a monetary value to the avoided exposure. Recently, the International Commission on Radiological Protection has stressed the need to take into consideration not only the collective exposure to ionising radiation but also its dispersion in the population. In this paper, using some well-known and some recent results in the economics of uncertainty, we discuss how to integrate these recommendations into the valuation of the benefit of protection.
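One hedged way to make the dispersion point concrete (an assumed illustrative form, not the valuation actually proposed in the paper) is to price avoided individual doses with a convex detriment cost, so that collective dose alone no longer determines the monetary benefit:

```latex
% Assumed illustrative valuation: N individuals, avoided doses d_1,...,d_N,
% and a convex per-caput cost c(.) with c(0)=0.
\[
  B \;=\; \sum_{i=1}^{N} c(d_i), \qquad c \ \text{convex}.
\]
% For a fixed collective dose \sum_i d_i, convexity implies that a more
% concentrated distribution of doses yields a larger B, so the equity of the
% exposure distribution enters the cost-benefit comparison.
```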
5.
We study, from the standpoint of coherence, comparative probabilities on an arbitrary family E of conditional events. Given a binary comparative relation, coherence conditions on the relation are related to de Finetti's coherent betting system: we consider their connections to the usual properties of comparative probability and to the possibility of numerical representations of the relation. In this context, the numerical reference frame is that of de Finetti's coherent subjective conditional probability, which is not introduced (as in Kolmogorov's approach) through a ratio between probability measures. Another relevant feature of our approach is that the family E need not have any particular algebraic structure, so that the ordering can initially be given for a few conditional events of interest and then possibly extended by a step-by-step procedure, preserving coherence.
6.
Inference in hybrid Bayesian networks using dynamic discretization
We consider approximate inference in hybrid Bayesian networks (BNs) and present a new iterative algorithm that efficiently combines dynamic discretization with robust propagation algorithms on junction trees. Our approach offers a significant extension to Bayesian network theory and practice by offering a flexible way of modeling continuous nodes in BNs conditioned on complex configurations of evidence and intermixed with discrete nodes as both parents and children of continuous nodes. Our algorithm is implemented in a commercial Bayesian network software package, AgenaRisk, which allows model construction and testing to be carried out easily. The results from the empirical trials clearly show how our software can deal effectively with different types of hybrid models containing elements of expert judgment as well as statistical inference. In particular, we show how the rapid convergence of the algorithm towards zones of high probability density makes robust inference analysis possible even in situations where, due to the lack of information in both prior and data, robust sampling becomes infeasible.
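The sketch below illustrates the core idea of dynamic discretization in a heavily simplified form, assuming a one-dimensional density and a crude midpoint error measure; it is not the junction-tree algorithm or the AgenaRisk implementation described in the paper.

```python
import numpy as np
from scipy.stats import norm

# Hedged sketch: iteratively split the interval that contributes the most
# approximation error when a continuous density is represented piecewise.
def dynamically_discretize(pdf, lo, hi, n_iters=20):
    edges = np.array([lo, hi])
    for _ in range(n_iters):
        mids = (edges[:-1] + edges[1:]) / 2
        widths = np.diff(edges)
        # Error proxy per interval: gap between the midpoint density and the
        # endpoint-average density, weighted by interval width (a crude
        # stand-in for the relative-entropy error used by the real algorithm).
        avg = np.array([(pdf(a) + pdf(b)) / 2
                        for a, b in zip(edges[:-1], edges[1:])])
        err = np.abs(pdf(mids) - avg) * widths
        worst = np.argmax(err)                       # split the worst interval
        edges = np.insert(edges, worst + 1, mids[worst])
    return edges

edges = dynamically_discretize(norm(0, 1).pdf, -5, 5)
print(np.round(edges, 2))   # bin edges concentrate in the high-density region
```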
7.
Topics in Microbial Risk Assessment: Dynamic Flow Tree Process
Microbial risk assessment is emerging as a new discipline in risk assessment. A systematic approach to microbial risk assessment is presented that employs data analysis for developing parsimonious models and accounts formally for the variability and uncertainty of model inputs using analysis of variance and Monte Carlo simulation. The purpose of the paper is to raise and examine issues in conducting microbial risk assessments. The enteric pathogen Escherichia coli O157:H7 was selected as an example for this study due to its significance to public health. The framework for our work is consistent with the risk assessment components described by the National Research Council in 1983 (hazard identification, exposure assessment, dose-response assessment, and risk characterization). Exposure assessment focuses on hamburgers cooked to a range of temperatures from rare to well done, the latter typical for fast food restaurants. Features of the model include predictive microbiology components that account for stochastic growth and death of organisms in hamburger. For dose-response modeling, Shigella data from human feeding studies were used as a surrogate for E. coli O157:H7. Risks were calculated using a threshold model and an alternative nonthreshold model. The 95% probability intervals for risk of illness for product cooked to a given internal temperature spanned five orders of magnitude for these models. The existence of even a small threshold has a dramatic impact on the estimated risk.
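A hedged Monte Carlo sketch of the threshold versus nonthreshold comparison: the post-cooking dose distribution and the dose-response parameters below are assumed values chosen for illustration, not the Shigella-based estimates used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed lognormal distribution of surviving dose per serving after cooking.
n = 100_000
dose = 10 ** rng.normal(loc=1.0, scale=1.5, size=n)

def p_ill_nonthreshold(d, r=1e-3):
    """Exponential (nonthreshold) dose-response model, assumed parameter r."""
    return 1.0 - np.exp(-r * d)

def p_ill_threshold(d, r=1e-3, threshold=10.0):
    """Same model, but zero probability of illness below a minimum dose."""
    return np.where(d < threshold, 0.0, 1.0 - np.exp(-r * d))

for name, model in [("nonthreshold", p_ill_nonthreshold),
                    ("threshold", p_ill_threshold)]:
    print(f"{name:12s} mean per-serving risk ~ {model(dose).mean():.2e}")
```

Even with a small threshold, a large share of low-dose servings contributes zero risk, which is why the two models can diverge sharply in the estimated risk.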
8.
For frequency counts, the situation of extra zeros often arises in biomedical applications. This is demonstrated with count data from a dental epidemiological study in Belo Horizonte (the Belo Horizonte caries prevention study), which evaluated various programmes for reducing caries. Extra zeros, however, violate the variance-mean relationship of the Poisson error structure. This extra-Poisson variation can easily be explained by a special mixture model, the zero-inflated Poisson (ZIP) model. On the basis of the ZIP model, a graphical device is presented which not only summarizes the mixing distribution but also provides visual information about the overall mean. This device can be exploited to evaluate and compare various groups. Ways are discussed to include covariates and to develop an extension of the conventional Poisson regression. Finally, a method to evaluate intervention effects on the basis of the ZIP regression model is described and applied to the data of the Belo Horizonte caries prevention study.
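A minimal ZIP regression sketch in the spirit of the caries example, using simulated data rather than the Belo Horizonte study data; the treatment indicator and the 30% structural-zero fraction are assumptions made for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(2)

# Simulated data: a treatment indicator lowers the Poisson mean, and a fixed
# fraction of subjects are structural zeros (never develop any caries).
n = 600
treated = rng.integers(0, 2, size=n)
mu = np.exp(1.0 - 0.5 * treated)                 # Poisson mean by group
structural_zero = rng.random(n) < 0.30           # extra zeros beyond Poisson
counts = np.where(structural_zero, 0, rng.poisson(mu))

X = sm.add_constant(pd.DataFrame({"treated": treated}))
zip_fit = ZeroInflatedPoisson(counts, X).fit(maxiter=200, disp=False)
print(zip_fit.summary())
# The inflation-part intercept captures the extra-zero component; the
# 'treated' coefficient is the intervention effect on the count part.
```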
9.
Using an MS-ARFIMA model in which the long-memory parameter d is allowed to switch between regimes, this paper presents a new empirical study of the dynamic behavior of China's monthly inflation path. The results show that Chinese inflation exhibits a "low-inflation" regime and a "high-inflation" regime not only in its mean level and uncertainty; more importantly, the stationarity of the inflation series also displays pronounced regime-switching dynamics. In the low-inflation regime the long-memory parameter is d1 = 0.361, indicating that inflation is a covariance-stationary series; in the high-inflation regime it is d2 = 1.145, indicating that inflation is nonstationary. This new finding implies that the persistence of inflation shocks in China also shifts across regimes. In managing inflation, the central bank therefore needs to consider regime changes not only in the mean and in uncertainty, but also in stationarity and persistence.
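Full MS-ARFIMA estimation requires specialized regime-switching code; as a hedged stand-in, the sketch below applies a GPH log-periodogram estimate of d to two simulated series meant to mimic a stationary low-inflation regime and a nonstationary high-inflation regime.

```python
import numpy as np

rng = np.random.default_rng(3)

def gph_d(x):
    """GPH log-periodogram estimate of the long-memory parameter d."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    t = len(x)
    m = int(np.sqrt(t))                              # bandwidth: ~sqrt(T) frequencies
    freqs = 2 * np.pi * np.arange(1, m + 1) / t
    periodogram = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * t)
    regressor = np.log(4 * np.sin(freqs / 2) ** 2)
    slope = np.polyfit(regressor, np.log(periodogram), 1)[0]
    return -slope

# Illustrative series only: white noise (d near 0) versus a random walk
# (d near 1), standing in for the two inflation regimes.
low_regime = rng.normal(size=400)
high_regime = np.cumsum(rng.normal(size=400))
print("d, low-inflation-like regime: ", round(gph_d(low_regime), 3))
print("d, high-inflation-like regime:", round(gph_d(high_regime), 3))
```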
10.
At present, most studies of Granger causality rely on bivariate Granger causality tests; because these ignore the influence of other important variables, they often produce spurious causal relationships. For this reason, a Granger causality graph model is used to analyze price transmission between China and its major trading partners (countries and regions). The results show that the United States plays the leading role in price transmission and that international price transmission exhibits a clear regional effect. Apart from a contemporaneous causal relationship with Hong Kong, China, China's price level has essentially no significant influence on the price levels of its major trading partners; China exports neither inflation nor deflation. At the same time, over the sample period China's price level shows a marked "imported" character. The Chinese government should therefore take measures to cope with international price shocks and guard against the risks created by imported price pressures, so as to keep prices in China stable.
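The sketch below illustrates the idea behind a Granger causality graph in a hedged, simplified way: causality is tested inside a joint VAR so that third variables are conditioned on, rather than through the bivariate tests criticized above. The three series are simulated stand-ins, not actual price data for the US, China, and Hong Kong.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(4)

# Simulated system in which the US series drives both other series with a lag,
# so bivariate tests between the two driven series could look spuriously causal.
n = 300
us = np.zeros(n)
cn = np.zeros(n)
hk = np.zeros(n)
for t in range(1, n):
    us[t] = 0.5 * us[t - 1] + rng.normal()
    cn[t] = 0.3 * cn[t - 1] + 0.4 * us[t - 1] + rng.normal()   # US -> CN
    hk[t] = 0.3 * hk[t - 1] + 0.4 * us[t - 1] + rng.normal()   # US -> HK

data = pd.DataFrame({"us": us, "cn": cn, "hk": hk})
res = VAR(data).fit(maxlags=2)

# Edges of the lagged causality graph: one conditional test per ordered pair.
for caused in data.columns:
    for causing in data.columns:
        if caused != causing:
            test = res.test_causality(caused, causing, kind="f")
            print(f"{causing} -> {caused}: p = {test.pvalue:.3f}")
```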