1.
As part of the celebration of the 40th anniversary of the Society for Risk Analysis and Risk Analysis: An International Journal, this essay reviews the 10 most important accomplishments of risk analysis from 1980 to 2010, outlines major accomplishments in three major categories from 2011 to 2019, discusses how editors circulate authors' accomplishments, and proposes 10 major risk-related challenges for 2020–2030. The authors conclude that the next decade will severely test the field of risk analysis.
Managing risk in infrastructure systems implies dealing with interdependent physical networks and their relationships with the natural and societal contexts. Computational tools are often used to support operational decisions aimed at improving resilience, whereas economics‐related tools tend to be used to address broader societal and policy issues in infrastructure management. We propose an optimization‐based framework for infrastructure resilience analysis that incorporates organizational and socioeconomic aspects into operational problems, allowing us to understand relationships between decisions at the policy level (e.g., regulation) and the technical level (e.g., optimal infrastructure restoration). We focus on three issues that arise when integrating such levels. First, optimal restoration strategies driven by financial and operational factors evolve differently from those driven by socioeconomic and humanitarian factors. Second, regulatory aspects have a significant impact on recovery dynamics (e.g., effective recovery is most challenging in societies with weak institutions and regulation, where individual interests may compromise societal well‐being). And third, the decision space (i.e., available actions) in postdisaster phases is strongly determined by predisaster decisions (e.g., resource allocation). The proposed optimization framework addresses these issues by using: (1) parametric analyses to test the influence of operational and socioeconomic factors on optimization outcomes, (2) regulatory constraints to model and assess the cost and benefit (for a variety of actors) of enforcing specific policy‐related conditions for the recovery process, and (3) sensitivity analyses to capture the effect of predisaster decisions on recovery. We illustrate our methodology with an example regarding the recovery of interdependent water, power, and gas networks in Shelby County, TN (USA), with exposure to natural hazards.
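The trade-off between restoration budget and achievable recovery can be caricatured with a toy scheduling problem. The sketch below is not the paper's model: the component names, crew-hour costs, and benefit scores are all hypothetical, and a brute-force 0/1 knapsack stands in for the full optimization over interdependent networks.

```python
# Hypothetical restoration actions: name -> (crew hours, service benefit)
components = {
    "water_pump_A": (40, 9.0),
    "power_feeder_1": (25, 7.5),
    "gas_valve_3": (10, 2.0),
    "power_feeder_2": (30, 6.0),
    "water_main_B": (55, 11.0),
}

def best_restoration_plan(components, budget_hours):
    """0/1 knapsack over restoration actions: maximize total service
    benefit subject to a crew-hours budget. The budget plays the role of
    a predisaster resource-allocation decision constraining the
    postdisaster decision space."""
    names = list(components)
    best = (0.0, frozenset())
    for mask in range(1 << len(names)):  # enumerate all subsets
        chosen = [names[i] for i in range(len(names)) if mask >> i & 1]
        hours = sum(components[n][0] for n in chosen)
        gain = sum(components[n][1] for n in chosen)
        if hours <= budget_hours and gain > best[0]:
            best = (gain, frozenset(chosen))
    return best

gain, plan = best_restoration_plan(components, budget_hours=100)
print(sorted(plan), gain)
```

Re-running with a different `budget_hours` shows how a predisaster allocation decision reshapes the optimal recovery plan, which is the third issue the abstract raises.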
4.
Perceptions of infectious diseases are important predictors of whether people engage in disease‐specific preventive behaviors. Having accurate beliefs about a given infectious disease has been found to be a necessary condition for engaging in appropriate preventive behaviors during an infectious disease outbreak, while endorsing conspiracy beliefs can inhibit preventive behaviors. Despite their seemingly opposing natures, knowledge and conspiracy beliefs may share some of the same psychological motivations, including a relationship with perceived risk and self‐efficacy (i.e., control). The 2015–2016 Zika epidemic provided an opportunity to explore this. The current research provides some exploratory tests of this topic derived from two studies with similar measures, but different primary outcomes: one study that included knowledge of Zika as a key outcome and one that included conspiracy beliefs about Zika as a key outcome. Both studies involved cross‐sectional data collections that occurred during the same two periods of the Zika outbreak: one data collection prior to the first cases of local Zika transmission in the United States (March–May 2016) and one just after the first cases of local transmission (July–August 2016). Using ordinal logistic and linear regression analyses of data from two time points in both studies, the authors show that the associations of perceived risk and self‐efficacy with both knowledge and conspiracy beliefs strengthened after local Zika transmission began in the United States. Although these results highlight that similar psychological motivations may lead to both Zika knowledge and conspiracy beliefs, the demographic associations of the two outcomes diverged.
5.
In this paper, we consider the deterministic trend model where the error process is allowed to be weakly or strongly correlated and subject to non‐stationary volatility. Extant estimators of the trend coefficient are analysed. We find that under heteroskedasticity, the Cochrane–Orcutt‐type estimator (with some initial condition) could be less efficient than Ordinary Least Squares (OLS) when the process is highly persistent, whereas it is asymptotically equivalent to OLS when the process is less persistent. An efficient non‐parametrically weighted Cochrane–Orcutt‐type estimator is then proposed. The efficiency is uniform over weak or strong serial correlation and non‐stationary volatility of unknown form. The feasible estimator relies on non‐parametric estimation of the volatility function, and the asymptotic theory is provided. We use a data‐dependent smoothing bandwidth that can automatically adjust for the strength of non‐stationarity in volatilities. The implementation does not require pretesting persistence of the process or specification of non‐stationary volatility. Finite‐sample evaluation via simulations and an empirical application demonstrates the good performance of the proposed estimators.
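A minimal simulation illustrates the Cochrane–Orcutt idea the paper builds on. This is the generic textbook version (iterated quasi-differencing with an OLS-based estimate of the AR(1) coefficient), not the authors' non-parametrically weighted estimator, and the data-generating values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
t = np.arange(1, n + 1)

# Simulate a deterministic trend with AR(1) errors: y_t = 1 + 0.5*t + u_t
rho = 0.6
e = rng.normal(size=n)
u = np.zeros(n)
for i in range(1, n):
    u[i] = rho * u[i - 1] + e[i]
y = 1.0 + 0.5 * t + u

def ols_trend(y, t):
    """OLS of y on an intercept and linear trend."""
    X = np.column_stack([np.ones_like(t, dtype=float), t])
    return np.linalg.lstsq(X, y, rcond=None)[0]

def cochrane_orcutt(y, t, n_iter=10):
    """Iterated Cochrane-Orcutt: estimate rho from residuals, then
    re-run OLS on quasi-differenced data (dropping the first obs)."""
    a, b = ols_trend(y, t)
    for _ in range(n_iter):
        resid = y - (a + b * t)
        rho_hat = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])
        y_star = y[1:] - rho_hat * y[:-1]
        # Transformed intercept column is (1 - rho), so its coefficient
        # stays on the original scale
        X_star = np.column_stack([(1 - rho_hat) * np.ones(len(t) - 1),
                                  t[1:] - rho_hat * t[:-1]])
        a, b = np.linalg.lstsq(X_star, y_star, rcond=None)[0]
    return a, b, rho_hat

a_ols, b_ols = ols_trend(y, t)
a_co, b_co, rho_hat = cochrane_orcutt(y, t)
print(f"OLS slope:             {b_ols:.4f}")
print(f"Cochrane-Orcutt slope: {b_co:.4f} (rho_hat = {rho_hat:.3f})")
```

Both estimators recover the true slope here; the paper's point concerns their relative efficiency once non-stationary volatility and strong persistence enter, which this homoskedastic toy setup does not exhibit.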
6.
The gendering of forestry as a distinctly masculine profession has led to a wide range of negative outcomes, including legal actions concerning discrimination, poor public perceptions and poor environmental records. Forestry organizations have addressed these concerns by attempting to increase the number of women in the profession. These efforts have been largely ineffective. Using the case of community‐based forestry, I argue that when we begin to consider not only women but also normatively feminine values as agents of change, our understanding of the profession of forestry may be rejuvenated.
7.
Proportional hazards are a common assumption when designing confirmatory clinical trials in oncology. This assumption not only affects the analysis part but also the sample size calculation. The presence of delayed effects causes a change in the hazard ratio while the trial is ongoing since at the beginning we do not observe any difference between treatment arms, and after some unknown time point, the differences between treatment arms will start to appear. Hence, the proportional hazards assumption no longer holds, and both sample size calculation and analysis methods to be used should be reconsidered. The weighted log‐rank test allows a weighting for early, middle, and late differences through the Fleming and Harrington class of weights and is proven to be more efficient when the proportional hazards assumption does not hold. The Fleming and Harrington class of weights, along with the estimated delay, can be incorporated into the sample size calculation in order to maintain the desired power once the treatment arm differences start to appear. In this article, we explore the impact of delayed effects in group sequential and adaptive group sequential designs and make an empirical evaluation in terms of power and type‐I error rate of the weighted log‐rank test in a simulated scenario with fixed values of the Fleming and Harrington class of weights. We also give some practical recommendations regarding which methodology should be used in the presence of delayed effects depending on certain characteristics of the trial.
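The Fleming–Harrington weighted log-rank statistic itself is standard and can be sketched directly. The implementation below is a plain NumPy version written for this note (the function name and toy data are our own): p = q = 0 recovers the ordinary log-rank test, while q > 0 up-weights late differences, which is the delayed-effect setting the abstract describes.

```python
import numpy as np

def fleming_harrington_test(time, event, group, p=0.0, q=0.0):
    """Weighted log-rank Z statistic with Fleming-Harrington weights
    w(t) = S(t-)^p * (1 - S(t-))^q, where S is the pooled Kaplan-Meier
    estimate just before each event time. Returns a statistic that is
    approximately N(0, 1) under the null of equal survival."""
    order = np.argsort(time)
    time, event, group = time[order], event[order], group[order]
    s_prev = 1.0          # pooled KM survival just before current time
    num, var = 0.0, 0.0
    for tt in np.unique(time[event == 1]):
        at_risk = time >= tt
        n = at_risk.sum()                                  # total at risk
        n1 = (at_risk & (group == 1)).sum()                # at risk, arm 1
        d = ((time == tt) & (event == 1)).sum()            # events now
        d1 = ((time == tt) & (event == 1) & (group == 1)).sum()
        w = (s_prev ** p) * ((1.0 - s_prev) ** q)
        num += w * (d1 - n1 * d / n)                       # observed - expected
        if n > 1:
            var += w**2 * n1 * (n - n1) * d * (n - d) / (n**2 * (n - 1))
        s_prev *= 1.0 - d / n                              # update pooled KM
    return num / np.sqrt(var)

# Toy data: the treatment arm (group 1) survives uniformly longer
time = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
event = np.ones(6, dtype=int)
group = np.array([0, 0, 0, 1, 1, 1])
print("log-rank  :", fleming_harrington_test(time, event, group, p=0, q=0))
print("FH(0, 1)  :", fleming_harrington_test(time, event, group, p=0, q=1))
```

Negative values indicate fewer events than expected in arm 1. In a delayed-effect trial the FH(0, 1) variant typically yields a larger standardized statistic than the unweighted test because all of the separation is late.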
8.
Open innovation and absorptive capacity are two concepts based on the idea that companies can leverage the knowledge generated externally to improve their innovation performance. The aim of this paper is to analyse the joint effect of open innovation and absorptive capacity on a firm's radical innovation. Open innovation is expressed in terms of external search breadth and depth strategies and absorptive capacity is described by distinguishing between potential and realized absorptive capacity. In order to test our hypotheses, we carried out empirical research in firms operating in high-technology industries. The results indicate that internal routines and processes for absorbing external knowledge help explain radical innovation as they show a significant effect of potential and realized absorptive capacity. Also, there is a moderating effect of absorptive capacity on open innovation. Specifically, potential absorptive capacity exerts a positive effect on the relationship between external search breadth and depth and radical innovation. Realized absorptive capacity moderates the influence of external search breadth. These findings confirm the complementary nature of absorptive capacity and open innovation search strategies on radical innovation.
9.
Generally, the semiclosed-form option pricing formula for complex financial models depends on unobservable factors such as stochastic volatility and jump intensity. A popular practice is to use an estimate of these latent factors to compute the option price. However, in many situations this plug-and-play approximation does not yield the appropriate price. This article examines this bias and quantifies its impacts. We decompose the bias into terms that are related to the bias on the unobservable factors and to the precision of their point estimators. The approximated price is found to be highly biased when only the history of the stock price is used to recover the latent states. This bias is corrected when option prices are added to the sample used to recover the states' best estimate. We also show numerically that such a bias is propagated on calibrated parameters, leading to erroneous values. The Canadian Journal of Statistics 48: 8–35; 2020 © 2019 Statistical Society of Canada
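The plug-in bias has a simple one-dimensional analogue: because an option price is a nonlinear (here convex) function of volatility, pricing at a point estimate of the latent state differs from averaging prices over its posterior. The sketch below uses Black–Scholes with a hypothetical Gaussian posterior for volatility; it illustrates the direction of the effect only, not the article's full filtering analysis.

```python
import numpy as np
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Hypothetical posterior for the latent volatility (numbers made up)
rng = np.random.default_rng(1)
sigma_draws = np.clip(rng.normal(0.20, 0.05, size=20000), 0.01, None)

S, K, T, r = 100.0, 110.0, 0.5, 0.01   # out-of-the-money call
plug_in = bs_call(S, K, T, r, sigma_draws.mean())                  # price at point estimate
expected = np.mean([bs_call(S, K, T, r, s) for s in sigma_draws])  # E[price | data]
print(f"plug-in price : {plug_in:.4f}")
print(f"expected price: {expected:.4f}")
print(f"bias          : {expected - plug_in:.4f}")
```

For this out-of-the-money strike the price is convex in volatility, so by Jensen's inequality the plug-in price understates the posterior-averaged price; sharpening the posterior (e.g., by adding option prices to the estimation sample, as the article does) shrinks the gap.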