301.
In the nuclear power industry, Level 3 probabilistic risk assessment (PRA) is used to estimate damage to public health and the environment if a severe accident leads to a large radiological release. Current Level 3 PRA does not explicitly include social factors and, therefore, it is not possible to perform importance ranking of social factors for risk-informing emergency preparedness, planning, and response (EPPR). This article offers a methodology for adapting the concept of social vulnerability, commonly used in natural hazard research, to the context of a severe nuclear power plant accident. The methodology has four steps: (1) calculating a hazard-independent social vulnerability index for the local population; (2) developing a location-specific representation of the maximum radiological hazard estimated from current Level 3 PRA, in a geographic information system (GIS) environment; (3) developing a GIS-based socio-technical risk map by combining the social vulnerability index and the location-specific radiological hazard; and (4) conducting a risk importance measure analysis to rank the criticality of social factors based on their contribution to the socio-technical risk. The methodology is applied using results from the 2012 Surry Power Station state-of-the-art reactor consequence analysis. A radiological hazard model is generated from the MELCOR accident consequence code system, translated into a GIS environment, and combined with the Centers for Disease Control and Prevention social vulnerability index (SVI). This research creates an opportunity to explicitly consider and rank the criticality of location-specific SVI themes based on their influence on risk, providing input for EPPR.
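Steps (3) and (4) can be illustrated with a minimal sketch: combine a per-location social vulnerability index with a location-specific hazard value into a socio-technical risk score, then rank SVI themes by how much total risk drops when a theme is removed. The theme names, the multiplicative aggregation, and the leave-one-theme-out importance measure are illustrative assumptions, not the paper's exact formulation or the CDC SVI definition.

```python
# Minimal sketch of steps (3)-(4): combine a social vulnerability index (SVI)
# with a location-specific radiological hazard into a socio-technical risk
# score, then rank SVI themes with a simple leave-one-theme-out importance
# measure. All numbers and field names are illustrative.

locations = [
    {"id": "tract_A",
     "svi_themes": {"socioeconomic": 0.8, "household": 0.6,
                    "minority_language": 0.4, "housing_transport": 0.7},
     "hazard": 0.9},
    {"id": "tract_B",
     "svi_themes": {"socioeconomic": 0.3, "household": 0.5,
                    "minority_language": 0.2, "housing_transport": 0.4},
     "hazard": 0.5},
]

def socio_technical_risk(svi_themes, hazard):
    """Aggregate theme scores into one SVI value and scale by hazard (illustrative)."""
    svi = sum(svi_themes.values()) / len(svi_themes)
    return svi * hazard

total_risk = sum(socio_technical_risk(loc["svi_themes"], loc["hazard"]) for loc in locations)

# Importance of each theme: relative change in total risk when the theme is dropped.
importance = {}
for theme in locations[0]["svi_themes"]:
    reduced = sum(
        socio_technical_risk({k: v for k, v in loc["svi_themes"].items() if k != theme},
                             loc["hazard"])
        for loc in locations
    )
    importance[theme] = (total_risk - reduced) / total_risk

for theme, score in sorted(importance.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{theme}: {score:+.3f}")
```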
302.
Organizational scholars increasingly recognize the value of employing historical research. Yet the fields of history and organization studies struggle to reconcile. In this paper, the authors contend that a closer connection between these two fields is possible if organizational historians bring their role in the construction of historical narratives to the fore and open up their research decisions for discussion. They provide guidelines to support this endeavor, drawing on four criteria that are prevalent within interpretive organization studies for developing the trustworthiness of research: credibility; confirmability; dependability; and transferability. In contrast to the traditional use of trustworthiness criteria to evaluate the quality of research, the authors advance the criteria to encourage historians to generate more transparent narratives. Such transparency allows others to comprehend and comment on the construction of narratives, thereby building trust and understanding. Each criterion is converted into a set of guiding principles to enhance the trustworthiness of historical research, pairing each principle with a practical technique gleaned from a range of disciplines within the social sciences to provide practical guidance.
303.
Modeling the dependence between uncertainties in decision and risk analyses is an important part of the problem structuring process. We focus on situations where correlated uncertainties are discrete, and extend the concept of the copula-based approach for modeling correlated continuous uncertainties to the representation of correlated discrete uncertainties. This approach reduces the required number of probability assessments significantly compared to approaches requiring direct estimates of conditional probabilities. It also allows the use of multiple dependence measures, including product moment correlation, rank order correlation and tail dependence, and parametric families of copulas such as normal copulas, t-copulas, and Archimedean copulas. This approach can be extended to model the dependence between discrete and continuous uncertainties in the same event tree.
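A minimal sketch of the copula-based idea for discrete uncertainties, assuming a Gaussian (normal) copula, two illustrative discrete marginals, and a dependence parameter of 0.6: draw correlated normals, transform them to uniforms, and push each uniform through the inverse CDF of its discrete marginal.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Marginal distributions of two discrete uncertainties (illustrative):
# outcomes and probabilities as they might be elicited from experts.
outcomes_x = np.array([10, 20, 50])          # e.g., low / medium / high demand
probs_x    = np.array([0.3, 0.5, 0.2])
outcomes_y = np.array([1, 2, 3, 4])          # e.g., project duration in years
probs_y    = np.array([0.1, 0.4, 0.3, 0.2])

def discrete_ppf(u, outcomes, probs):
    """Inverse CDF of a discrete distribution: map uniform draws to outcomes."""
    cdf = np.cumsum(probs)
    idx = np.searchsorted(cdf, u, side="left")
    return outcomes[np.clip(idx, 0, len(outcomes) - 1)]

# Gaussian copula with an assumed dependence parameter.
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=100_000)
u = stats.norm.cdf(z)                        # uniform margins preserving dependence

x = discrete_ppf(u[:, 0], outcomes_x, probs_x)
y = discrete_ppf(u[:, 1], outcomes_y, probs_y)

# Rank correlation induced between the two discrete variables.
rho_s, _ = stats.spearmanr(x, y)
print("Induced Spearman rank correlation:", round(rho_s, 3))
```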
304.
Research across a variety of risk domains finds that the risk perceptions of professionals and the public differ. Such risk perception gaps occur if professionals and the public understand individual risk factors differently or if they aggregate risk factors into overall risk differently. The nature of such divergences, whether based on objective inaccuracies or on differing perspectives, is important to understand. However, evidence of risk perception gaps typically pertains to general, overall risk levels; evidence of and details about mismatches between the specific level of risk faced by individuals and their perceptions of that risk are less available. We examine these issues with a paired data set of professional and resident assessments of parcel-level wildfire risk for private property in a wildland–urban interface community located in western Colorado, United States. We find evidence of a gap between the parcel-level risk assessments of a wildfire professional and numerous measures of residents’ risk assessments. Overall risk ratings diverge for the majority of properties, as do judgments about many specific property attributes and about the relative contribution of these attributes to a property's overall level of risk. However, overall risk gaps are not well explained by many factors commonly found to relate to risk perceptions. Understanding the nature of these risk perception gaps can facilitate improved communication by wildfire professionals about how risks can be mitigated on private lands. These results also speak to the general nature of individual-level risk perception.
305.
To protect and secure food resources for the United States, it is crucial to have a method to compare food systems’ criticality. In 2007, the U.S. government funded development of the Food and Agriculture Sector Criticality Assessment Tool (FASCAT) to determine which food and agriculture systems were most critical to the nation. FASCAT was developed in a collaborative process involving government officials and food industry subject matter experts (SMEs). After development, data were collected using FASCAT to quantify threats, vulnerabilities, consequences, and the impacts on the United States from failure of evaluated food and agriculture systems. To examine FASCAT's utility, linear regression models were used to determine: (1) which groups of questions posed in FASCAT were better predictors of cumulative criticality scores; (2) whether the items included in FASCAT's criticality method or the smaller subset of FASCAT items included in DHS's risk analysis method predicted similar criticality scores. Akaike's information criterion was used to determine which regression models best described criticality, and a mixed linear model was used to shrink estimates of criticality for individual food and agriculture systems. The results indicated that: (1) some of the questions used in FASCAT strongly predicted food or agriculture system criticality; (2) the FASCAT criticality formula was a stronger predictor of criticality compared to the DHS risk formula; (3) the cumulative criticality formula predicted criticality more strongly than the weighted criticality formula; and (4) the mixed linear regression model did not change the rank-order of food and agriculture system criticality to a large degree.
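The model-comparison step can be sketched with ordinary least squares and AIC: fit one model on a stand-in for the full FASCAT item set and one on a smaller subset standing in for the DHS risk-formula items, then compare AIC values for the same response. The synthetic data, predictor names, and coefficients below are invented purely for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200

# Synthetic stand-ins: "fascat_items" for the full FASCAT question set,
# "dhs_items" for the smaller DHS risk-formula subset of those items.
fascat_items = rng.normal(size=(n, 4))
dhs_items = fascat_items[:, :2]
criticality = fascat_items @ np.array([1.5, 1.0, 0.8, 0.5]) + rng.normal(scale=1.0, size=n)

def fit_ols(y, X):
    """Ordinary least squares with an intercept."""
    return sm.OLS(y, sm.add_constant(X)).fit()

full_model = fit_ols(criticality, fascat_items)
subset_model = fit_ols(criticality, dhs_items)

# Lower AIC indicates the better-supported model for the same response.
print("FASCAT-style model AIC:", round(full_model.aic, 1))
print("DHS-style model AIC:   ", round(subset_model.aic, 1))
```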
306.
Access management, which systematically limits opportunities for egress and ingress of vehicles to highway lanes, is critical to protect trillions of dollars of current investment in transportation. This article addresses allocating resources for access management with incomplete and partially relevant data on crash rates, travel speeds, and other factors. While access management can be effective to avoid crashes, reduce travel times, and increase route capacities, the literature suggests a need for performance metrics to guide investments in resource allocation across large corridor networks and several time horizons. In this article, we describe a quantitative decision model to support an access management program via risk-cost-benefit analysis under data uncertainties from diverse sources of data and expertise. The approach quantifies potential benefits, including safety improvement and travel time savings, and costs of access management through functional relationships of input parameters including crash rates, corridor access point densities, and traffic volumes. Parameter uncertainties, which vary across locales and experts, are addressed via numerical interval analyses. This approach is demonstrated at several geographic scales across 7,000 kilometers of highways in a geographic region and several subregions. The demonstration prioritizes route segments that would benefit from risk management, including (i) additional data or elicitation, (ii) right-of-way purchases, (iii) restriction or closing of access points, (iv) new alignments, (v) developer proffers, and (vi) other measures. The approach ought to be of wide interest to analysts, planners, policymakers, and stakeholders who rely on heterogeneous data and expertise for risk management.
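The treatment of parameter uncertainty by numerical interval analysis can be sketched with simple interval arithmetic: propagate [low, high] bounds on crashes avoided, cost per crash, and program cost through a benefit-cost ratio for one segment. All ranges, units, and the linear benefit model are hypothetical, not values from the study.

```python
# Minimal sketch of interval-based benefit-cost screening for one corridor
# segment. All parameter ranges are illustrative.

def interval_mul(a, b):
    """Product of two intervals [a_lo, a_hi] x [b_lo, b_hi]."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def interval_div(a, b):
    """Quotient of two intervals where the denominator is strictly positive."""
    assert b[0] > 0, "denominator interval must exclude zero"
    return (a[0] / b[1], a[1] / b[0])

# Uncertain inputs as [low, high] intervals (hypothetical units).
crashes_avoided_per_year = (2.0, 6.0)        # crashes avoided by managing access points
cost_per_crash = (0.1e6, 0.4e6)              # dollars
annual_program_cost = (0.2e6, 0.5e6)         # dollars

annual_benefit = interval_mul(crashes_avoided_per_year, cost_per_crash)
bc_ratio = interval_div(annual_benefit, annual_program_cost)

print(f"Benefit interval:   ${annual_benefit[0]:,.0f} to ${annual_benefit[1]:,.0f}")
print(f"Benefit-cost ratio: {bc_ratio[0]:.2f} to {bc_ratio[1]:.2f}")
# A segment whose ratio interval lies entirely above 1 is a robust candidate;
# one straddling 1 may call for additional data or elicitation first.
```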
307.
This paper concerns the forecasting of seasonal intraday time series that exhibit repeating intraweek and intraday cycles. A recently proposed exponential smoothing method involves smoothing a different intraday cycle for each distinct type of day of the week. Similar days are allocated identical intraday cycles. A limitation is that the method allows only whole days to be treated as identical. We introduce a new exponential smoothing formulation that allows parts of different days of the week to be treated as identical. The result is a method that involves the smoothing and initialisation of fewer terms. We evaluate forecasting up to a day ahead using two empirical studies. For electricity load data, the new method compares well with a range of alternatives. The second study involves a series of arrivals at a call centre that is open for a shorter duration at the weekends than on weekdays. Among the variety of methods considered, the new method is the only one that can satisfactorily model this situation, in which the number of periods on each day of the week is not the same.
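A rough sketch of the pooling idea (not the paper's exact smoothing equations): maintain a level and one seasonal index per "slot", and let a user-supplied mapping assign several intraday periods, possibly on different days, to the same slot, so those parts of the week are treated as identical. The smoothing parameters, the slot mapping, and the toy series are illustrative.

```python
from collections import defaultdict

def smooth_intraday(series, slot_of, alpha=0.1, gamma=0.2):
    """
    Additive level-plus-seasonal-index smoothing where slot_of(t) maps each
    observation index t to a seasonal slot. Periods sharing a slot are treated
    as identical (e.g., weekday evenings pooled across days). Returns
    one-step-ahead forecasts. Illustrative sketch, not the paper's method.
    """
    level = series[0]
    seasonal = defaultdict(float)            # one seasonal index per slot, starting at 0
    forecasts = []
    for t, y in enumerate(series):
        slot = slot_of(t)
        forecasts.append(level + seasonal[slot])
        error = y - (level + seasonal[slot])
        level += alpha * error
        seasonal[slot] += gamma * error
    return forecasts

# Example: 48 half-hourly periods per day; pool the same evening period across
# all weekdays into one slot, keep separate day-of-week slots otherwise.
def slot_of(t, periods_per_day=48):
    day, period = divmod(t, periods_per_day)
    weekday = day % 7 < 5
    if weekday and period >= 36:             # 6pm onward on any weekday
        return ("weekday_evening", period)
    return (day % 7, period)

demand = [100 + 20 * ((t % 48) >= 36) for t in range(48 * 14)]   # toy two-week series
fc = smooth_intraday(demand, slot_of)
print("Last forecast vs. actual:", round(fc[-1], 1), demand[-1])
```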
308.
309.
Behavioral economics has captured the interest of scholars and the general public by demonstrating ways in which individuals make decisions that appear irrational. While increasing attention is being focused on the implications of this research for the design of risk-reducing policies, less attention has been paid to how it affects the economic valuation of policy consequences. This article considers the latter issue, reviewing the behavioral economics literature and discussing its implications for the conduct of benefit-cost analysis, particularly in the context of environmental, health, and safety regulations. We explore three concerns: using estimates of willingness to pay or willingness to accept compensation for valuation, considering the psychological aspects of risk when valuing mortality-risk reductions, and discounting future consequences. In each case, we take the perspective that analysts should avoid making judgments about whether values are “rational” or “irrational.” Instead, they should make every effort to rely on well-designed studies, using ranges, sensitivity analysis, or probabilistic modeling to reflect uncertainty. More generally, behavioral research has led some to argue for a more paternalistic approach to policy analysis. We argue instead for continued focus on describing the preferences of those affected, while working to ensure that these preferences are based on knowledge and careful reflection.
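The suggestion to use ranges, sensitivity analysis, or probabilistic modeling can be illustrated with a small Monte Carlo sketch: sample an uncertain willingness-to-pay value and an uncertain discount rate, compute the present value of a benefit stream for each draw, and report percentiles rather than a single point estimate. The distributions, horizon, and population size are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_draws = 50_000

# Illustrative assumptions: uncertain per-person annual WTP for a small
# mortality-risk reduction and an uncertain discount rate over a 20-year horizon.
wtp = rng.lognormal(mean=np.log(50.0), sigma=0.5, size=n_draws)   # dollars/person/year
discount_rate = rng.uniform(0.02, 0.07, size=n_draws)
population = 1_000_000
years = np.arange(1, 21)

# Present value of the annual benefit stream for each draw.
discount_factors = 1.0 / (1.0 + discount_rate[:, None]) ** years[None, :]
present_value = (wtp * population)[:, None] * discount_factors
pv_total = present_value.sum(axis=1)

lo, med, hi = np.percentile(pv_total, [5, 50, 95])
print(f"Present value of benefits (5th/50th/95th pct): ${lo:,.0f} / ${med:,.0f} / ${hi:,.0f}")
```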
310.
A review of recently developed simple techniques for analyzing data in the form of proportions from replicated cross-sectional sample surveys. (A) One variable over time—tests for homogeneity, pooling homogeneous proportions, tests for linear trends with as few as three time points, tests for departures from linearity; (B) Two variables with a constant percentage difference—four fold tables as rudimentary causal models, decomposing change with linear flow graphs; (C) Extension of the flow graph techniques to three or more variables and changing coefficients; (D) Comments implying panel designs are somewhat over-rated and successive cross-sections somewhat under-rated.
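The techniques in (A) can be sketched with standard tools: a chi-square test of homogeneity of proportions across survey waves and a Cochran-Armitage style test for a linear trend over as few as three or four time points. The wave counts below are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical replicated cross-sections: number answering "yes" and sample
# size in each of four survey waves.
yes = np.array([320, 340, 365, 390])
n   = np.array([1000, 1000, 1000, 1000])

# Test of homogeneity: are the four proportions equal?
table = np.vstack([yes, n - yes])
chi2, p_hom, dof, _ = stats.chi2_contingency(table)
print(f"Homogeneity: chi2={chi2:.2f}, df={dof}, p={p_hom:.4f}")

# Cochran-Armitage test for a linear trend across waves (score-test form),
# with equally spaced wave scores.
scores = np.arange(len(yes), dtype=float)
p_bar = yes.sum() / n.sum()
num = np.sum(scores * (yes - n * p_bar))
var = p_bar * (1 - p_bar) * (np.sum(n * scores**2) - (np.sum(n * scores))**2 / n.sum())
z = num / np.sqrt(var)
p_trend = 2 * stats.norm.sf(abs(z))
print(f"Linear trend: z={z:.2f}, p={p_trend:.4f}")
```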