91.
Taguchi (1986) derived tolerances for subcomponents, subsystems, parts, and materials in which the relationship between a higher-level (Y) and a lower-level (X) quality characteristic is assumed to be deterministic and linear, namely Y = α + βX, without an error term. Tsai (1990) developed a probabilistic tolerance design for a subsystem in which a bivariate normal distribution between these two quality characteristics was combined with Taguchi's quadratic loss function to obtain a closed-form solution of the tolerance design. The Burr family is very flexible for fitting sample data and has a positive domain. A bivariate Burr distribution can describe a nonlinear relationship between two quality characteristics; it is therefore adopted in place of the bivariate normal distribution, and simple solutions of three probabilistic tolerance designs for a subsystem are obtained for the “nominal-is-best”, “smaller-is-better”, and “larger-is-better” quality characteristics, respectively, using Taguchi's loss functions.
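For reference, the three Taguchi quadratic loss functions mentioned above are conventionally written as follows (a standard textbook formulation; the cost constant k and target value m are generic symbols, not values taken from this abstract):

```latex
% Standard Taguchi quadratic loss functions for a quality characteristic y.
% k is a cost constant and m the target value.
L_{\text{nominal-is-best}}(y) = k\,(y - m)^2, \qquad
L_{\text{smaller-is-better}}(y) = k\,y^2, \qquad
L_{\text{larger-is-better}}(y) = \frac{k}{y^{2}}.
```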
93.
In the literature there are at least two models for probabilistic belief revision: Bayesian updating and imaging [Lewis, D. K. (1973), Counterfactuals, Blackwell, Oxford; Gärdenfors, P. (1988), Knowledge in Flux: Modeling the Dynamics of Epistemic States, MIT Press, Cambridge, MA]. In this paper we focus on imaging rules that can be described by the following procedure: (1) identify every state with some real-valued vector of characteristics, and accordingly identify every probabilistic belief with an expected vector of characteristics; (2) for every initial belief and every piece of information, choose the revised belief which is compatible with this information and for which the expected vector of characteristics has minimal Euclidean distance to the expected vector of characteristics of the initial belief. This class of rules thus satisfies an intuitive notion of minimal belief revision. The main result of this paper is an axiomatic characterization of this class of imaging rules.
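A minimal numerical sketch of the two-step procedure just described (the state space, characteristic vectors, and observed event below are illustrative assumptions, not taken from the paper):

```python
# Sketch of minimal-Euclidean-distance imaging (illustrative, not the paper's exact setup).
import numpy as np
from scipy.optimize import minimize

# Hypothetical example: 4 states, each identified with a 2-dimensional characteristic vector.
X = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
p0 = np.array([0.4, 0.3, 0.2, 0.1])      # initial belief
E = [1, 3]                                # information: "the true state is 1 or 3"

mu0 = p0 @ X                              # expected characteristics of the initial belief

def objective(q_E):
    # squared distance between expected characteristics of revised and initial beliefs
    mu = q_E @ X[E]
    return np.sum((mu - mu0) ** 2)

cons = [{"type": "eq", "fun": lambda q: q.sum() - 1.0}]
bnds = [(0.0, 1.0)] * len(E)
res = minimize(objective, x0=np.full(len(E), 1.0 / len(E)),
               bounds=bnds, constraints=cons, method="SLSQP")

q = np.zeros_like(p0)
q[E] = res.x                              # revised belief, supported on the observed event
print("revised belief:", np.round(q, 3))
```

The small quadratic program simply projects the initial expected characteristics onto the set of expected characteristics achievable by beliefs concentrated on the observed event.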
94.
The leaching of organotin (OT) heat stabilizers from polyvinyl chloride (PVC) pipes used in residential drinking water systems may affect the quality of drinking water. These OTs, principally mono- and di-substituted species of butyltins and methyltins, are a potential health concern because they belong to a broad class of compounds that may be immune, nervous, and reproductive system toxicants. In this article, we develop probability distributions of U.S. population exposures to mixtures of OTs encountered in drinking water transported by PVC pipes. We employed a family of mathematical models to estimate OT leaching rates from PVC pipe as a function of both surface area and time. We then integrated the distribution of estimated leaching rates into an exposure model that estimated the probability distribution of OT concentrations in tap water and the resulting potential human OT exposures via tap water consumption. Our study results suggest that human OT exposures through tap water consumption are likely to be considerably lower than the World Health Organization (WHO) "safe" long-term concentration in drinking water (150 μg/L) for dibutyltin (DBT), the most toxic of the OTs considered in this article. The 90th percentile average daily dose (ADD) estimate of 0.034 ± 2.92 × 10⁻⁴ μg/kg-day is approximately 120 times lower than the WHO-based ADD for DBT (4.2 μg/kg-day).
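The exposure chain described above (leaching rate, tap water concentration, average daily dose) can be illustrated with a simple Monte Carlo sketch; the distributions and parameter values below are placeholders, not the article's fitted inputs:

```python
# Illustrative Monte Carlo sketch of a tap-water exposure model (placeholder inputs).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Placeholder distributions -- NOT the article's fitted values.
leach_rate = rng.lognormal(mean=np.log(0.5), sigma=0.8, size=n)   # ug leached per m^2 pipe per day
pipe_area  = rng.uniform(1.0, 5.0, size=n)                        # m^2 of PVC pipe surface
daily_flow = rng.uniform(100.0, 400.0, size=n)                    # litres of water through the pipe per day
intake     = rng.normal(1.2, 0.3, size=n).clip(min=0.1)           # litres of tap water consumed per day
body_mass  = rng.normal(70.0, 12.0, size=n).clip(min=30.0)        # kg

conc = leach_rate * pipe_area / daily_flow        # ug/L in delivered tap water
add  = conc * intake / body_mass                  # average daily dose, ug/kg-day

print("90th percentile ADD (ug/kg-day):", np.percentile(add, 90))
```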
95.
Most decisions in life involve ambiguity, where probabilities cannot be meaningfully specified, as much as they involve probabilistic uncertainty. In such conditions, the aspiration to utility maximization may be self-deceptive. We propose "robust satisficing" as an alternative to utility maximizing as the normative standard for rational decision making in such circumstances. Instead of seeking to maximize the expected value, or utility, of a decision outcome, robust satisficing aims to maximize the robustness to uncertainty of a satisfactory outcome. That is, robust satisficing asks "what is a 'good enough' outcome?" and then seeks the option that will produce such an outcome under the widest set of circumstances. We explore the conditions under which robust satisficing is a more appropriate norm for decision making than utility maximizing.
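A minimal sketch of a robust-satisficing choice rule in this spirit (the options, the linear worst-case payoff model, and the "good enough" threshold are assumptions made purely for illustration):

```python
# Robust satisficing sketch: pick the option whose satisfactory outcome survives
# the largest amount of uncertainty (illustrative payoff model, not from the paper).
GOOD_ENOUGH = 50.0           # the "good enough" outcome threshold (assumed)

options = {                   # hypothetical options: (nominal payoff, sensitivity to uncertainty)
    "A": (90.0, 20.0),        # high payoff, fragile
    "B": (70.0, 5.0),         # moderate payoff, fairly robust
    "C": (55.0, 1.0),         # low payoff, very robust
}

def robustness(nominal, sensitivity):
    """Largest uncertainty level h at which the worst case (nominal - sensitivity*h)
    still meets GOOD_ENOUGH."""
    if nominal < GOOD_ENOUGH:
        return 0.0
    return (nominal - GOOD_ENOUGH) / sensitivity

best = max(options, key=lambda k: robustness(*options[k]))
for name, (nom, sens) in options.items():
    print(name, "robustness =", round(robustness(nom, sens), 2))
print("robust-satisficing choice:", best)
```

In this toy example option C is chosen even though option A has the highest nominal payoff, because C keeps delivering a good-enough outcome over the widest range of uncertainty.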
96.
Principal component regression (PCR) has two steps: estimating the principal components and performing the regression using these components. These two steps are generally performed sequentially. In PCR, a crucial issue is the selection of the principal components to be included in the regression. In this paper, we build a hierarchical probabilistic PCR model with a dynamic component selection procedure. A latent variable is introduced to select promising subsets of components based upon the significance of the relationship between the response variable and the principal components in the regression step. We illustrate this model using real and simulated examples. The simulations demonstrate that our approach outperforms some existing methods in terms of root mean squared error of the regression coefficient.
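As a point of reference, here is a plain (non-Bayesian) PCR sketch in which components are kept or dropped according to the strength of their marginal relationship with the response; this is a simplified stand-in for the hierarchical probabilistic selection described above, run on synthetic data:

```python
# Plain PCR with data-driven component selection (a frequentist stand-in for the
# hierarchical probabilistic selection described above; synthetic data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.0, 0.5]
y = X @ beta + rng.normal(scale=0.5, size=n)

# Step 1: principal components.
pca = PCA().fit(X)
Z = pca.transform(X)

# Step 2: keep only components whose marginal correlation with y is non-negligible
# (|correlation| above an arbitrary threshold -- an assumption for illustration).
corr = np.array([np.corrcoef(Z[:, j], y)[0, 1] for j in range(p)])
keep = np.abs(corr) > 0.1
print("components kept:", np.where(keep)[0])

reg = LinearRegression().fit(Z[:, keep], y)
# Map the coefficients on the kept components back to the original predictors.
beta_hat = pca.components_[keep].T @ reg.coef_
print("RMSE of coefficient estimate:", np.sqrt(np.mean((beta_hat - beta) ** 2)))
```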
97.
This paper uses “revealed probability trade‐offs” to provide a natural foundation for probability weighting in the famous von Neumann and Morgenstern axiomatic set‐up for expected utility. In particular, it shows that a rank‐dependent preference functional is obtained in this set‐up when the independence axiom is weakened to stochastic dominance and a probability trade‐off consistency condition. In contrast with the existing axiomatizations of rank‐dependent utility, the resulting axioms allow for complete flexibility regarding the outcome space. Consequently, a parameter‐free test/elicitation of rank‐dependent utility becomes possible. The probability‐oriented approach of this paper also provides theoretical foundations for probabilistic attitudes towards risk. It is shown that the preference conditions that characterize the shape of the probability weighting function can be derived from simple probability trade‐off conditions.
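For concreteness, the rank-dependent preference functional referred to above is conventionally written as follows (standard notation with utility u and probability weighting function w; this is the textbook form, not the paper's own axiomatic statement):

```latex
% Rank-dependent utility of a lottery with outcomes ranked x_1 \ge x_2 \ge \dots \ge x_n
% and corresponding probabilities p_1, \dots, p_n, where w(0)=0 and w(1)=1:
RDU = \sum_{i=1}^{n}\Bigl[\, w\Bigl(\textstyle\sum_{j=1}^{i} p_j\Bigr)
      - w\Bigl(\textstyle\sum_{j=1}^{i-1} p_j\Bigr) \Bigr]\, u(x_i).
```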
98.
A probabilistic expert system provides a graphical representation of a joint probability distribution which can be used to simplify and localize calculations. Jensen et al. (1990) introduced a flow-propagation algorithm for calculating marginal and conditional distributions in such a system. This paper analyses that algorithm in detail, and shows how it can be modified to perform other tasks, including maximization of the joint density and simultaneous fast retraction of evidence entered on several variables.
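A toy illustration of the sum- versus max-propagation distinction mentioned above, on a two-clique junction tree with cliques {A,B} and {B,C}; the potentials are made up and this is not Jensen et al.'s full algorithm:

```python
# Minimal sketch of sum- vs max-propagation on a two-clique junction tree
# (cliques {A,B} and {B,C}); toy potentials, not a real expert system.
import numpy as np

phi_AB = np.array([[0.3, 0.7],
                   [0.6, 0.4]])          # phi_AB[a, b]
phi_BC = np.array([[0.9, 0.1],
                   [0.2, 0.8]])          # phi_BC[b, c]

# Sum-propagation: message over the separator B, then the marginal of C.
msg_sum = phi_AB.sum(axis=0)             # sum out A -> function of B
joint_C = (msg_sum[:, None] * phi_BC).sum(axis=0)
p_C = joint_C / joint_C.sum()
print("P(C):", np.round(p_C, 3))

# Max-propagation: replace sums by maxima to score the most probable configurations.
msg_max = phi_AB.max(axis=0)
score_C = (msg_max[:, None] * phi_BC).max(axis=0)
print("max-product scores for C:", np.round(score_C, 3))
```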
99.
As a more general model for representing linguistic information, the extended probabilistic linguistic term set can describe the original evaluation information more fully and improve the scientific quality of linguistic multi-attribute decision making. In view of this, this paper proposes a multi-attribute group decision-making method based on a consensus model and the ORESTE method for multi-attribute group decision-making problems in an extended probabilistic linguistic environment. First, the concept of the extended probabilistic linguistic term set and the related theory are presented. Second, considering that experts with different knowledge backgrounds and abilities may provide different evaluations during group decision making, leading to inconsistent group opinions, a consensus model in the extended probabilistic linguistic environment is proposed. Third, since in most cases there is no single ranking order among the alternatives, the classical ORESTE method is improved and an extended probabilistic linguistic ORESTE method is proposed. Based on the proposed consensus model and the extended probabilistic linguistic ORESTE method, an extended probabilistic linguistic multi-attribute group decision-making method is developed. Finally, to verify the effectiveness and rationality of the proposed method, a numerical example on evaluating bike-sharing design schemes is analyzed, and a comparison with other methods demonstrates the advantages of the proposed method.
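A minimal sketch of the kind of distance-based consensus index such consensus models are built around (a generic numerical construction; it does not use the paper's extended probabilistic linguistic term sets or its ORESTE ranking):

```python
# Generic distance-based consensus index among expert evaluation matrices
# (illustrative only; not the paper's extended probabilistic linguistic model).
import numpy as np

rng = np.random.default_rng(2)
n_experts, n_alternatives, n_criteria = 4, 3, 5

# Hypothetical evaluations already mapped to numerical scores in [0, 1].
evals = rng.uniform(size=(n_experts, n_alternatives, n_criteria))
group = evals.mean(axis=0)                          # collective opinion

# Consensus degree of each expert: 1 minus the average distance to the group opinion.
dist = np.abs(evals - group).mean(axis=(1, 2))
consensus = 1.0 - dist
print("per-expert consensus:", np.round(consensus, 3))

THRESHOLD = 0.85                                     # assumed consensus threshold
for k, c in enumerate(consensus):
    if c < THRESHOLD:
        print(f"expert {k} is asked to revise evaluations toward the group opinion")
```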
100.
The objectives of this study are to understand tradeoffs between forest carbon and timber values, and evaluate the impact of uncertainty in improved forest management (IFM) carbon offset projects to improve forest management decisions. The study uses probabilistic simulation of uncertainty in financial risk for three management scenarios (clearcutting in 45‐ and 65‐year rotations and no harvest) under three carbon price schemes (historic voluntary market prices, cap and trade, and carbon prices set to equal net present value (NPV) from timber‐oriented management). Uncertainty is modeled for value and amount of carbon credits and wood products, the accuracy of forest growth model forecasts, and four other variables relevant to American Carbon Registry methodology. Calculations use forest inventory data from a 1,740 ha forest in western Washington State, using the Forest Vegetation Simulator (FVS) growth model. Sensitivity analysis shows that FVS model uncertainty contributes more than 70% to overall NPV variance, followed in importance by variability in inventory sample (3–14%), and short‐term prices for timber products (8%), while variability in carbon credit price has little influence (1.1%). At regional average land‐holding costs, a no‐harvest management scenario would become revenue‐positive at a carbon credit break‐point price of $14.17/Mg carbon dioxide equivalent (CO2e). IFM carbon projects are associated with a greater chance of both large payouts and large losses to landowners. These results inform policymakers and forest owners of the carbon credit price necessary for IFM approaches to equal or better the business‐as‐usual strategy, while highlighting the magnitude of financial risk and reward through probabilistic simulation.
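The kind of probabilistic NPV comparison described above can be sketched with a small Monte Carlo simulation; all distributions and parameter values below are placeholders, not the study's FVS-based inputs or its reported break-even price:

```python
# Illustrative Monte Carlo comparison of timber vs. carbon-project NPV
# (placeholder parameters; not the study's FVS-based inputs).
import numpy as np

rng = np.random.default_rng(3)
n, horizon, discount = 10_000, 45, 0.05
years = np.arange(1, horizon + 1)
disc = (1 + discount) ** -years

# Timber scenario: single clearcut revenue at the end of the rotation (uncertain price).
timber_revenue = rng.normal(8_000, 1_500, size=n)        # $/ha at harvest
npv_timber = timber_revenue * disc[-1]

# No-harvest carbon scenario: uncertain annual credited sequestration and credit price.
annual_credits = rng.normal(5.0, 1.5, size=(n, horizon)).clip(min=0)   # Mg CO2e/ha/yr
credit_price   = rng.lognormal(np.log(10.0), 0.4, size=(n, 1))          # $/Mg CO2e
npv_carbon = (annual_credits * credit_price * disc).sum(axis=1)

print("P(carbon NPV > timber NPV):", round(float(np.mean(npv_carbon > npv_timber)), 3))

# Break-even credit price: the price at which median carbon NPV equals median timber NPV.
base_credits_npv = (annual_credits * disc).sum(axis=1)    # NPV of credits at $1/Mg CO2e
breakeven = np.median(npv_timber) / np.median(base_credits_npv)
print("approx. break-even credit price ($/Mg CO2e):", round(breakeven, 2))
```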