101.
Probability elicitation protocols are used to assess and incorporate subjective probabilities in risk and decision analysis. While most of these protocols use methods that have focused on the precision of the elicited probabilities, the speed of the elicitation process has often been neglected. However, speed is also important, particularly when experts need to examine a large number of events on a recurrent basis. Furthermore, most existing elicitation methods are numerical in nature, but there are various reasons why an expert would refuse to give such precise ratio-scale estimates, even if highly numerate. This may occur, for instance, when there is lack of sufficient hard evidence, when assessing very uncertain events (such as emergent threats), or when dealing with politicized topics (such as terrorism or disease outbreaks). In this article, we adopt an ordinal ranking approach from multicriteria decision analysis to provide a fast and nonnumerical probability elicitation process. Probabilities are subsequently approximated from the ranking by an algorithm based on the principle of maximum entropy, a rule compatible with the ordinal information provided by the expert. The method can elicit probabilities for a wide range of different event types, including new ways of eliciting probabilities for stochastically independent events and low-probability events. We use a Monte Carlo simulation to test the accuracy of the approximated probabilities and try the method in practice, applying it to a real-world risk analysis recently conducted for DEFRA (the U.K. Department for Environment, Food and Rural Affairs): the prioritization of animal health threats.
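The abstract does not spell out the approximation algorithm. As a hedged sketch of the general idea, the snippet below approximates probabilities consistent with an ordinal ranking by averaging uniform draws from the ordered region of the probability simplex (its centroid, which for a pure ranking coincides with the classic rank-order-centroid weights). This is a simple stand-in, not the paper's maximum-entropy algorithm.

```python
import math
import random

def ranked_probability_estimates(n, samples=20000, seed=0):
    """Approximate probabilities for n events given only their ordinal
    ranking p1 >= p2 >= ... >= pn: average points drawn uniformly from
    the ordered region of the probability simplex (its centroid)."""
    rng = random.Random(seed)
    totals = [0.0] * n
    for _ in range(samples):
        # Uniform draw on the simplex via normalized exponentials,
        # sorted descending so every sample respects the ranking.
        draws = sorted((-math.log(1.0 - rng.random()) for _ in range(n)),
                       reverse=True)
        s = sum(draws)
        for i, d in enumerate(draws):
            totals[i] += d / s
    return [t / samples for t in totals]
```

For n events this converges to the closed-form rank-order-centroid weights ROC_i = (1/n) Σ_{k=i..n} 1/k, a common quick surrogate for probabilities elicited only as a ranking.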
102.
This paper studies fixed-point problems for a new class of multivalued contraction-type mappings in non-Archimedean probabilistic metric spaces, obtains a new result, and generalizes several known results.
103.
This article discusses the methodologies presently available for analyzing the contribution of "external initiators" to overall risks in the context of PRA (probabilistic risk assessment) of large commercial nuclear power reactors. "External initiators" include earthquakes, fires and floods inside the plant, external floods, high winds, aircraft, barge, and ship collisions, noxious or explosive gases offsite, and so on. These are in contrast to "internal initiators" such as active or passive plant equipment failures, human errors, and loss of electrical power. The ability to consider external initiators within PRA has undergone major advances in recent years. In general, uncertainties associated with the calculated risks from external initiators are much larger than those associated with internal initiators. The principal uncertainties lie with development of hazard curves (such as the frequency of occurrence of an event exceeding a given size: for example, the likelihood of a hurricane with winds exceeding 125 knots). For assessment of earthquakes, internal fires and floods, and high winds, the methodology is reasonably mature for qualitative assessment but not for quantitative application. The risks from other external initiators are generally considered to be low, either because of the very long recurrence time associated with the events or because the plants are judged to be well designed to withstand them.
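The hazard curves mentioned above give an annual frequency of exceeding a given event size. Under a Poisson arrival assumption, such a frequency converts to a probability of at least one exceedance over a plant lifetime; a minimal sketch with a purely illustrative rate (not taken from any plant study):

```python
import math

def prob_exceedance(annual_rate, years):
    """Chance of at least one exceedance in `years`, assuming events
    arrive as a homogeneous Poisson process with the given annual rate."""
    return 1.0 - math.exp(-annual_rate * years)

# Illustrative hazard-curve point (assumed number): winds above 125 knots
# at an annual exceedance frequency of 1e-4 per site-year.
p_40yr = prob_exceedance(1e-4, 40)   # over a 40-year plant life
```

For small rate-times-duration products the result is close to the product itself, which is why annual exceedance frequencies are often quoted directly as lifetime risk proxies.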
104.
Measures of sensitivity and uncertainty have become an integral part of risk analysis. Many such measures have a conditional probabilistic structure, for which a straightforward Monte Carlo estimation procedure has a double-loop form. Recently, a more efficient single-loop procedure has been introduced, and consistency of this procedure has been demonstrated separately for particular measures, such as those based on variance, density, and information value. In this work, we give a unified proof of single-loop consistency that applies to any measure satisfying a common rationale. This proof is not only more general but invokes less restrictive assumptions than heretofore in the literature, allowing for the presence of correlations among model inputs and of categorical variables. We examine numerical convergence of such an estimator under a variety of sensitivity measures. We also examine its application to a published medical case study.
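As one concrete instance of the single-loop idea (for the variance-based measure only; the paper's unified proof covers a much broader class), a first-order sensitivity index can be estimated from a single i.i.d. sample by binning on the input, avoiding the double loop over conditional samples:

```python
import random

def first_order_index(xs, ys, bins=50):
    """Single-loop estimate of the first-order variance-based sensitivity
    of y to x: sort the single i.i.d. sample by x, split it into bins,
    and compare the variance of the per-bin conditional means of y with
    the total variance of y."""
    n = len(xs)
    order = sorted(range(n), key=lambda i: xs[i])
    mean_y = sum(ys) / n
    var_y = sum((v - mean_y) ** 2 for v in ys) / n
    size = n // bins
    var_cond = 0.0
    for b in range(bins):
        idx = order[b * size:(b + 1) * size]
        m = sum(ys[i] for i in idx) / len(idx)
        var_cond += len(idx) * (m - mean_y) ** 2
    return var_cond / (n * var_y)

# Toy model y = x1 + x2 with independent uniform inputs: each input
# explains half the output variance, so both indices should be near 0.5.
rng = random.Random(1)
x1 = [rng.random() for _ in range(100000)]
x2 = [rng.random() for _ in range(100000)]
y = [a + b for a, b in zip(x1, x2)]
s1 = first_order_index(x1, y)
```

The same sample is reused for every input, which is the source of the single-loop efficiency gain over re-simulating conditional distributions per input.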
105.
This paper investigates the applicability of a Monte Carlo technique known as simulated annealing to achieve optimum or sub-optimum decompositions of probabilistic networks under bounded resources. High-quality decompositions are essential for performing efficient inference in probabilistic networks. Optimum decomposition of probabilistic networks is known to be NP-hard (Wen, 1990). The paper proves that cost-function changes can be computed locally, which is essential to the efficiency of the annealing algorithm. Pragmatic control schedules which reduce the running time of the annealing algorithm are presented and evaluated. Apart from the conventional temperature parameter, these schedules involve the radius of the search space as a new control parameter. The evaluation suggests that the inclusion of this new parameter is important for the success of the annealing algorithm for the present problem.
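The paper's decomposition cost function is not reproduced here, but the role of locally computable cost changes can be illustrated on a toy objective. In the sketch below (schedule parameters are assumptions for illustration), each swap move's cost delta is evaluated from only the few terms the swap touches, exactly the property that keeps an annealing step cheap:

```python
import math
import random

def tv_cost(a):
    """Toy cost: total variation over adjacent positions."""
    return sum(abs(a[i + 1] - a[i]) for i in range(len(a) - 1))

def local_delta(a, i, j):
    """Cost change from swapping positions i < j, evaluated from the
    few adjacent terms the swap touches rather than the whole state."""
    terms = {k for k in (i - 1, i, j - 1, j) if 0 <= k < len(a) - 1}
    old = sum(abs(a[k + 1] - a[k]) for k in terms)
    a[i], a[j] = a[j], a[i]
    new = sum(abs(a[k + 1] - a[k]) for k in terms)
    a[i], a[j] = a[j], a[i]          # undo the tentative swap
    return new - old

def anneal(a, steps=20000, t0=10.0, alpha=0.9995, seed=2):
    rng = random.Random(seed)
    cost, t = tv_cost(a), t0
    for _ in range(steps):
        i, j = sorted(rng.sample(range(len(a)), 2))
        d = local_delta(a, i, j)
        # Metropolis rule: accept downhill always, uphill with prob e^(-d/t).
        if d <= 0 or rng.random() < math.exp(-d / t):
            a[i], a[j] = a[j], a[i]
            cost += d
        t *= alpha
    return cost
```

Because `cost` is maintained incrementally from the local deltas, a full recomputation is never needed inside the loop; the two must nevertheless agree exactly at the end, which makes the bookkeeping easy to verify.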
106.
Ingestion of contaminated soil is one potential internal exposure pathway in areas contaminated by the Fukushima Daiichi Nuclear Power Plant accident. Doses from this pathway can be overestimated if the availability of radioactive nuclides in soils for the gastrointestinal tract is not considered. The concept of bioaccessibility has been adopted to evaluate this availability based on in vitro tests. This study evaluated the bioaccessibility of radioactive cesium from soils via the physiologically-based extraction test (PBET) and its extractability via an extraction test with 1 mol/L hydrochloric acid (HCl). The bioaccessibility obtained in the PBET was 5.3% ± 1%, and the extractability in the tests with HCl was 16% ± 3%. The bioaccessibility was strongly correlated with the extractability, which suggests that extractability in HCl can serve as a good predictor of PBET bioaccessibility. In addition, we assessed the doses to children from the ingestion of soil via hand-to-mouth activity based on our PBET results, using a probabilistic approach that considers the spatial distribution of radioactive cesium in Date City in Fukushima Prefecture and the interindividual differences in the surveyed amounts of soil ingestion in Japan. The results of this assessment indicate that even if children were to routinely ingest a large amount of soil with relatively high contamination, the radiation doses from this pathway are negligible compared with doses from external exposure owing to deposited radionuclides in Fukushima Prefecture.
107.
Managerial decisions are often substantiated by forecasts of dynamic indicators, so the forecasting accuracy of these indicators must be acceptable, and forecasting algorithms are continually refined to achieve it. This paper considers a variant of the method of forecasting binary outcomes, which predicts whether or not a future value of an indicator will exceed a predetermined level; this method is called 'interval forecasting'. A robust interval forecasting algorithm based on a probabilistic cluster model is proposed, and its accuracy is compared with that of an algorithm based on logistic regression, using indicators with different statistical properties. The results show that the accuracy of the two algorithms is approximately similar in most cases; however, cases were identified in which the logistic-regression algorithm showed unacceptable accuracy while the proposed algorithm did not. Thus, the new algorithm is more accurate.
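The probabilistic cluster model itself is not described in the abstract. As a hedged illustration of the task, the sketch below implements the logistic-regression baseline flavor of interval forecasting on a synthetic indicator, predicting whether the next value exceeds a threshold of 0; all data and parameters are illustrative assumptions:

```python
import math
import random

def fit_logistic(X, y, lr=0.1, epochs=200):
    """Batch gradient-descent logistic regression (bias + weights)."""
    w = [0.0] * (len(X[0]) + 1)
    n = len(X)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wk * xk for wk, xk in zip(w[1:], xi))
            err = 1.0 / (1.0 + math.exp(-z)) - yi
            grad[0] += err
            for k, xk in enumerate(xi):
                grad[k + 1] += err * xk
        w = [wk - lr * g / n for wk, g in zip(w, grad)]
    return w

def predict_exceeds(w, x):
    """Interval forecast: will the next value exceed the threshold?"""
    z = w[0] + sum(wk * xk for wk, xk in zip(w[1:], x))
    return 1.0 / (1.0 + math.exp(-z)) > 0.5

# Synthetic AR(1)-style indicator; features are the last two observations.
rng = random.Random(3)
series = [0.0]
for _ in range(600):
    series.append(0.8 * series[-1] + rng.gauss(0.0, 0.5))
X = [[series[t], series[t - 1]] for t in range(1, len(series) - 1)]
y = [1 if series[t + 1] > 0 else 0 for t in range(1, len(series) - 1)]
w = fit_logistic(X[:400], y[:400])
acc = sum(predict_exceeds(w, xi) == bool(yi)
          for xi, yi in zip(X[400:], y[400:])) / len(X[400:])
```

On a strongly autocorrelated series like this one, the baseline does well; the paper's point is that such a baseline can fail on indicators with other statistical properties.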
108.
The end states reached by an engineered system during an accident scenario depend not only on the sequences of the events composing the scenario, but also on their timing and magnitudes. Including these additional features within an overarching framework can render the analysis infeasible in practical cases, due to the high dimension of the system state-space and the computational effort correspondingly needed to explore the possible system evolutions in search of the interesting (and very rare) ones of failure. To tackle this hurdle, in this article we introduce a framework for efficiently probing the space of event sequences of a dynamic system by means of a guided Monte Carlo simulation. Such framework is semi-automatic and allows embedding the analyst's prior knowledge about the system and his/her objectives of analysis. Specifically, the framework allows adaptively and intelligently allocating the simulation efforts preferably on those sequences leading to outcomes of interest for the objectives of the analysis, e.g., typically those that are more safety-critical (and/or rare). The emerging diversification in the filling of the state-space by the preference-guided exploration also allows the retrieval of critical system features, which can be useful to analysts and designers for taking appropriate means of prevention and mitigation of dangerous and/or unexpected consequences. A dynamic system for gas transmission is considered as a case study to demonstrate the application of the method.
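One hedged way to picture preference-guided exploration, far simpler than the paper's semi-automatic framework, is biased sampling of event-tree branches with likelihood-ratio reweighting: the rare safety-critical sequence is visited often under the guided walk, while its probability under the unguided model can still be recovered from the weights. All branch structure and probabilities below are illustrative assumptions:

```python
import random

def rollout(rng, p_crit, depth):
    """One guided rollout through a binary event tree. At each branch the
    'critical' event is taken with probability p_crit; the likelihood
    ratio w.r.t. the unguided 50/50 walk is tracked alongside."""
    path, ratio = [], 1.0
    for _ in range(depth):
        crit = rng.random() < p_crit
        path.append(crit)
        ratio *= 0.5 / (p_crit if crit else 1.0 - p_crit)
    return path, ratio

def rare_sequence_prob(n=2000, p_crit=0.8, depth=10, seed=4):
    """Estimate the unguided probability of the all-critical sequence
    using guided sampling plus likelihood reweighting."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        path, ratio = rollout(rng, p_crit, depth)
        if all(path):
            total += ratio
        # Non-critical paths contribute zero to this particular estimate.
    return total / n
```

With p_crit = 0.8 the guided walk reaches the depth-10 all-critical sequence on roughly 11% of rollouts, versus about 0.1% for the unguided walk, yet the reweighted estimate still targets the unguided probability of 0.5**10.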
109.
Questions persist regarding assessment of workers' exposures to products containing low levels of benzene, such as mineral spirit solvent (MSS). This study summarizes previously unpublished data for parts-washing activities, and evaluates potential daily and lifetime cumulative benzene exposures incurred by workers who used historical and current formulations of a recycled mineral spirits solvent in manual parts washers. Measured benzene concentrations in historical samples from parts-washing operations were frequently below analytical detection limits. To better assess benzene exposure among these workers, air-to-solvent concentration ratios measured for toluene, ethylbenzene, and xylenes (TEX) were used to predict those for benzene based on a statistical model, conditional on physical-chemical theory supported by new thermodynamic calculations of TEX and benzene activity coefficients in a modeled MSS-type solvent. Using probabilistic methods, the distributions of benzene concentrations were then combined with distributions of other exposure parameters to estimate eight-hour time-weighted average (TWA) exposure concentration distributions and corresponding daily respiratory dose distributions for workers using these solvents in parts washers. The estimated 50th (95th) percentile of the daily respiratory dose and corresponding eight-hour TWA air concentration for workers performing parts washing are 0.079 (0.77) mg and 0.0030 (0.028) parts per million by volume (ppm) for historical solvent, and 0.020 (0.20) mg and 0.00078 (0.0075) ppm for current solvent, respectively. Both 95th percentile eight-hour TWA respiratory exposure estimates for solvent formulations are less than 10% of the current Occupational Safety and Health Administration permissible exposure limit of 1.0 ppm for benzene.
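The probabilistic combination of exposure parameters described above can be sketched as a Monte Carlo over assumed distributions. None of the parameter values below are the study's fitted ones; they are purely illustrative placeholders showing how percentile TWA and dose estimates fall out of such a simulation:

```python
import math
import random

MG_PER_M3_PER_PPM = 3.19   # benzene, approx. 25 degC and 1 atm

def simulate_exposure(n=20000, seed=5):
    """Monte Carlo sketch combining distributions of task concentration,
    task duration, and breathing rate into 8-h TWA and daily-dose
    percentiles. All parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    twas, doses = [], []
    for _ in range(n):
        conc_ppm = rng.lognormvariate(math.log(0.003), 1.0)  # task-time conc.
        hours = rng.uniform(0.5, 2.0)                        # time at the washer
        breathing = rng.uniform(0.8, 1.3)                    # m^3 per hour
        twas.append(conc_ppm * hours / 8.0)                  # 8-h TWA, ppm
        doses.append(conc_ppm * MG_PER_M3_PER_PPM * breathing * hours)  # mg
    twas.sort()
    doses.sort()
    pick = lambda xs, q: xs[int(q * len(xs))]
    return pick(twas, 0.5), pick(twas, 0.95), pick(doses, 0.5), pick(doses, 0.95)
```

The wide gap between the 50th and 95th percentiles in the study's results is the expected signature of the roughly lognormal concentration term dominating the product.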
110.