Search results: 1,569 records found (search time: 78 ms)
1.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainty. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the difficulty of forecasting damage from these storms. Because of shifts in the track and the rapid intensification of the storm, real-time estimates grew from 2 to 3 billion dollars in losses late on August 12 to a brief peak of 50 billion dollars as the storm appeared to be headed for the Tampa Bay area. The storm instead hit the resort areas of Charlotte Harbor near Punta Gorda and then moved on to Orlando in the central part of the state, with early post-storm estimates converging on a damage figure in the 28 to 31 billion dollar range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several companies in the United States that seek approval of their models for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To inform future analyses of this kind, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. The sensitivity analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss attributable to each of the input variables.
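A toy illustration of the rank-based sensitivity idea described above. All input distributions and the wind/loss formulas here are invented for the sketch and bear no relation to the audited proprietary models:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical input distributions, for illustration only
holland_b = rng.normal(1.3, 0.1, n)            # Holland B pressure-profile shape
delta_p = rng.normal(55.0, 5.0, n)             # central pressure deficit (mb)
r_max = rng.lognormal(np.log(30.0), 0.2, n)    # radius of maximum winds (km)

# Toy stand-ins for the wind-field and damage models
wind = 10.0 * np.sqrt(holland_b * delta_p) / (r_max / 30.0) ** 0.25
loss = (wind / wind.mean()) ** 4               # damage rises steeply with wind


def rank(a):
    """Return the ranks (0..n-1) of the entries of a."""
    return np.argsort(np.argsort(a))


# The squared rank correlation approximates each input's share of the
# output variance when the input-output relationship is monotone.
rho2 = {}
for name, x in [("Holland B", holland_b), ("delta p", delta_p), ("R_max", r_max)]:
    rho = np.corrcoef(rank(x), rank(loss))[0, 1]
    rho2[name] = rho ** 2
    print(f"{name}: rho^2 = {rho2[name]:.2f}")
```

With these made-up distributions the three inputs contribute comparable shares; in a real audit the shares would be computed against the applicant's actual model runs.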
2.
It is hypothesized that trauma and early object loss arrest the normal and healthy progression of a child's development and also disrupt the child's capacity to engage in symbolic play. In therapy, over time, with a constant object, a latency-aged child was able to re-enact early trauma and loss, make substantial gains in development, and begin to play in a symbolic and expressive way.
3.
It is often of interest to find the maximum or near maxima among a set of vector‐valued parameters in a statistical model; in the case of disease mapping, for example, these correspond to relative‐risk “hotspots” where public‐health intervention may be needed. The general problem is one of estimating nonlinear functions of the ensemble of relative risks, but biased estimates result if posterior means are simply substituted into these nonlinear functions. The authors obtain better estimates of extrema from a new, weighted ranks squared error loss function. The derivation of these Bayes estimators assumes a hidden‐Markov random‐field model for relative risks, and their behaviour is illustrated with real and simulated data.
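A minimal sketch of why plug-in estimates of ensemble extrema are biased, using a simple normal-normal shrinkage model in place of the paper's hidden-Markov random-field prior; all numbers below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
K, tau, sigma = 50, 1.0, 1.0              # areas, prior sd, observation sd

theta = rng.normal(0.0, tau, K)           # true log relative risks
y = theta + rng.normal(0.0, sigma, K)     # noisy area-level observations

# Normal-normal posterior means shrink each observation toward the prior mean
shrink = tau ** 2 / (tau ** 2 + sigma ** 2)
post_mean = shrink * y

# Substituting posterior means into the nonlinear max() functional tends to
# understate the true ensemble maximum, which motivates loss functions
# tailored to extrema such as the weighted ranks squared error loss.
print(f"true max: {theta.max():.2f}, plug-in max: {post_mean.max():.2f}")
```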
4.
This article generalizes Savage's theory to include event-dependent preferences. The state space is partitioned into finitely many events. The induced preferences over consequences are assumed independent of the underlying states within, but not across, these events. This results in an additively separable representation of preferences over acts. The dependence of the preference relation over consequences on the events is represented by event-dependent mappings of the set of consequences onto itself. Given these mappings, the preferences on acts are represented by the expectation of event-dependent utilities on the consequences with respect to unique subjective probabilities on the states. Helpful discussions with David Schmeidler are gratefully acknowledged.
5.
We consider the problem of estimating the scale parameter of an exponential or a gamma distribution under squared error loss when the scale parameter θ is known to be greater than some fixed value θ0. Natural estimators in this setting include truncated linear functions of the sufficient statistic. Such estimators are typically inadmissible, but explicit improvements seem difficult to find; some are presented here. A particularly interesting finding is that estimators that are admissible in the untruncated problem and take values only in the interior of the truncated parameter space turn out to be inadmissible for the truncated problem.
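The appeal of truncated linear estimators is easy to see in simulation. A sketch under assumed values (exponential data, a known lower bound theta0, and the simple truncated estimator max(xbar, theta0)); the dominating estimators constructed in the paper are more delicate than this:

```python
import numpy as np

rng = np.random.default_rng(1)
theta_true, theta0, n = 1.2, 1.0, 20   # true scale, known lower bound, sample size

# 5,000 replicated exponential samples; the sample mean is the
# sufficient statistic for the scale parameter.
xbar = rng.exponential(theta_true, size=(5000, n)).mean(axis=1)

# Untruncated MLE vs. the truncated linear estimator max(xbar, theta0),
# which respects the constraint theta >= theta0.  When theta_true >= theta0,
# truncation can only move an estimate closer to the truth.
mse_mle = np.mean((xbar - theta_true) ** 2)
mse_trunc = np.mean((np.maximum(xbar, theta0) - theta_true) ** 2)
print(f"MSE untruncated: {mse_mle:.4f}, truncated: {mse_trunc:.4f}")
```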
6.
In quantum domains, the measurement (or observation) of one of a pair of complementary variables introduces an unavoidable uncertainty in the value of that variable's complement. Such uncertainties are negligible in Newtonian worlds, where observations can be made without appreciably disturbing the observed system. Hence, one would not expect that an observation of a non-quantum probabilistic outcome could affect a probability distribution over subsequently possible states in a way that would conflict with classical probability calculations. This paper examines three problems in which observations appear to affect the probabilities and expected utilities of subsequent outcomes in ways that may appear paradoxical. Deeper analysis reveals that the anomalies arise not from paradox but from faulty inferences drawn from the observations themselves. Thus the notion of quantum decision theory is disparaged.
7.
Pope, Robin. Theory and Decision (2000), 49(3): 223–234.
Expected utility theory does not directly deal with the utility of chance. It has been suggested in the literature (Samuelson, 1952; Markowitz, 1959) that this can be remedied by an approach which explicitly models the emotional consequences that give rise to the utility of chance. We refer to this as the elaborated outcomes approach. It is argued that the elaborated outcomes approach destroys the possibility of deriving a representation theorem based on the usual axioms of expected utility theory. This is shown with the help of an example due to Markowitz. It turns out that the space of conceivable lotteries over elaborated outcomes is too narrow to permit the application of the axioms. Moreover, it is shown that a representation theorem does not hold for the example.
8.
A heteroscedastic regression model based on the odd log-logistic Marshall–Olkin normal (OLLMON) distribution is defined, extending previous models. Some structural properties of this distribution are presented. Parameter estimation is addressed by maximum likelihood. Simulations across different parameter settings, sample sizes, and scenarios investigate the performance of the heteroscedastic OLLMON regression. We use residual analysis to detect influential observations and to check the model assumptions. The new regression explains the mass loss of different wood species used in civil construction in Brazil.
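As a rough sketch of heteroscedastic regression fitted by maximum likelihood, the snippet below uses a normal model with a log-linear scale; the OLLMON distribution itself is not implemented here, and all data and parameter values are simulated assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 400
x = rng.uniform(0.0, 1.0, n)
# Simulated response whose spread grows with x (heteroscedasticity)
y = 1.0 + 2.0 * x + rng.normal(0.0, np.exp(-0.5 + 1.0 * x))


def nll(p):
    """Negative log-likelihood: linear mean, log-linear scale."""
    b0, b1, g0, g1 = p
    mu = b0 + b1 * x
    sigma = np.exp(g0 + g1 * x)   # log link keeps the scale positive
    return -norm.logpdf(y, mu, sigma).sum()


fit = minimize(nll, x0=np.zeros(4), method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-8})
b0, b1, g0, g1 = fit.x
print(f"mean: {b0:.2f} + {b1:.2f} x,  log-scale: {g0:.2f} + {g1:.2f} x")
```

Swapping `norm.logpdf` for the density of a heavier-tailed or skewed family is what distinguishes models like the OLLMON regression; the optimization scaffolding stays the same.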
9.
Several models for studies related to the tensile strength of materials have been proposed in the literature in which the size or length component is an important factor in the specimens' failure behaviour. An important model, developed on the basis of the cumulative damage approach, is the three-parameter extension of the Birnbaum–Saunders fatigue model, which incorporates the size of the specimen as an additional variable. This model is a strong competitor of the commonly used Weibull model and performs better than traditional models that do not incorporate the size effect. The paper considers two such cumulative damage models, checks their compatibility with a real dataset, compares them with some recent toolkits, and finally recommends the model that appears most appropriate. Throughout, the study is Bayesian, based on Markov chain Monte Carlo simulation.
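A rough sketch of this style of model comparison, using scipy's two-parameter fatigue-life (Birnbaum–Saunders) distribution against a Weibull competitor on simulated data; the paper's three-parameter size-extended models and its Bayesian MCMC machinery are not reproduced here:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Simulated strength data from a fatigue-life (Birnbaum-Saunders) law;
# the size covariate of the three-parameter extension is omitted.
data = stats.fatiguelife.rvs(0.5, loc=0.0, scale=100.0, size=200,
                             random_state=rng)

# Compare the cumulative-damage model against the Weibull competitor by AIC
results = {}
for name, dist in [("fatigue-life", stats.fatiguelife),
                   ("Weibull", stats.weibull_min)]:
    params = dist.fit(data, floc=0.0)    # location pinned at zero
    loglik = dist.logpdf(data, *params).sum()
    k = len(params) - 1                  # location was not estimated
    results[name] = 2 * k - 2 * loglik
    print(f"{name}: AIC = {results[name]:.1f}")
```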
10.
This article is concerned with the analysis of a random sample from a binomial distribution when all the outcomes are zero (or unity). We discuss how elicitation of the prior can be reduced to asking the expert whether (and which of) the so-called borderline or equilibrium priors are plausible.
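A small sketch of the conjugate updating that underlies such an analysis; the uniform and Jeffreys priors below are generic stand-ins, not the paper's borderline or equilibrium priors:

```python
from scipy.stats import beta

n = 20   # trials observed, every one a failure (zero successes)

# With a conjugate Beta(a, b) prior, observing zero successes in n trials
# gives a Beta(a, b + n) posterior for the success probability p.
for a, b in [(1.0, 1.0), (0.5, 0.5)]:   # uniform and Jeffreys priors
    post = beta(a, b + n)
    print(f"Beta({a},{b}) prior -> posterior mean {post.mean():.4f}, "
          f"95% upper bound {post.ppf(0.95):.4f}")
```

Even with no successes observed, the posterior upper bound on p stays well above zero, which is why the choice of prior matters so much in the all-zero case.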
Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号