1,003 results found (search time: 218 ms)
1.
ABSTRACT

The cost and time of pharmaceutical drug development continue to grow at rates that many say are unsustainable. These trends have enormous impact on what treatments get to patients, when they get them, and how they are used. The statistical framework for supporting decisions in regulated clinical development of new medicines has followed a traditional path of frequentist methodology. Trials using hypothesis tests of "no treatment effect" are done routinely, and a p-value < 0.05 is often the determinant of what constitutes a "successful" trial. Many drugs fail in clinical development, adding to the cost of new medicines, and some evidence points blame at the deficiencies of the frequentist paradigm. An unknown number of effective medicines may have been abandoned because trials were declared "unsuccessful" due to a p-value exceeding 0.05. Recently, the Bayesian paradigm has shown utility in the clinical drug development process for its probability-based inference. We argue for a Bayesian approach that employs data from other trials as a "prior" for Phase 3 trials, so that synthesized evidence across trials can be used to compute probability statements that are valuable for understanding the magnitude of treatment effect. Such a Bayesian paradigm provides a promising framework for improving statistical inference and regulatory decision making.
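The abstract's central proposal, using evidence from earlier trials as a prior for the Phase 3 analysis, can be illustrated with a minimal normal-normal conjugate update. All numbers below are hypothetical stand-ins (an assumed Phase 2 effect estimate and an assumed Phase 3 result), not values from the paper:

```python
from math import sqrt, erf

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Hypothetical inputs: treatment-effect estimate (and standard error)
# synthesized from earlier trials, used as the prior, plus the new
# Phase 3 estimate.
prior_mean, prior_se = 2.0, 1.5
data_mean, data_se = 1.6, 0.8

# Normal-normal conjugate update: precision-weighted combination.
w_prior = 1.0 / prior_se**2
w_data = 1.0 / data_se**2
post_var = 1.0 / (w_prior + w_data)
post_mean = post_var * (w_prior * prior_mean + w_data * data_mean)

# A probability statement of the kind the abstract advocates:
# the posterior probability that the true effect exceeds zero.
p_benefit = normal_cdf(post_mean / sqrt(post_var))
print(f"posterior mean = {post_mean:.2f}, P(effect > 0) = {p_benefit:.3f}")
```

The posterior mean lands between the prior and the Phase 3 estimate, weighted by precision; the resulting probability statement is directly interpretable in a way a p-value is not.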
2.
Time, Self, and the Curiously Abstract Concept of Agency*   (Total citations: 2; self-citations: 0; citations by others: 2)
The term "agency" is quite slippery and is used differently depending on the epistemological roots and goals of the scholars who employ it. Distressingly, the sociological literature on the concept rarely addresses relevant social psychological research. We take a social behaviorist approach to agency by suggesting that individual temporal orientations are underutilized in conceptualizing this core sociological concept. Different temporal foci, the actor's engaged response to situational circumstances, implicate different forms of agency. This article offers a theoretical model involving four analytical types of agency ("existential", "identity", "pragmatic", and "life course") that are often conflated across treatments of the topic. Each mode of agency overlaps with established social psychological literatures, most notably those on the self, enabling scholars to anchor overly abstract treatments of agency within established research literatures.
5.
Family therapy has not served battered women well. Men use battering to silence women; a woman, once abused, is unlikely to speak honestly in a situation where doing so invites re-abuse. Therefore we rarely perceive, label, or deal effectively with male violence toward women, a major source of marital disruption. To stand with the oppressed, we must learn to detect the possibility of abuse, separate the couple, and refuse to collude with criminal acts.
6.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainty. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from 2 to 3 billion dollars in losses late on August 12 to a peak of 50 billion dollars for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm hit the resort areas of Charlotte Harbor near Punta Gorda and then went on to Orlando in the central part of the state, with early poststorm estimates converging on a damage estimate in the 28 to 31 billion dollars range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To inform future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. The sensitivity analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss that is attributable to each of the input variables.
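The kind of analysis described, propagating uncertain inputs through a Holland B wind field model and attributing output variance to each input, can be sketched as a toy Monte Carlo exercise. The input distributions, nominal values, and site radius below are all assumptions for illustration; the actual FCHLPM audits use proprietary models and data:

```python
import random
import statistics
from math import sqrt, exp

def holland_wind(r_km, b, dp_pa, rmax_km, rho=1.15):
    """Cyclostrophic Holland (1980) wind speed (m/s) at radius r_km,
    with the Coriolis term neglected for simplicity."""
    x = (rmax_km / r_km) ** b
    return sqrt(b * dp_pa / rho * x * exp(-x))

random.seed(1)

def sample_inputs():
    # Hypothetical uniform input distributions, for illustration only.
    return dict(b=random.uniform(1.0, 2.5),
                dp_pa=random.uniform(3000, 9000),    # pressure deficit (Pa)
                rmax_km=random.uniform(20, 60))      # radius of max winds

site_r = 40.0   # assumed distance of the site from the storm center (km)
n = 20000

# Uncertainty analysis: total variance of wind speed at the site.
base = [holland_wind(site_r, **sample_inputs()) for _ in range(n)]
total_var = statistics.pvariance(base)

# Crude first-order sensitivity: fix one input at a nominal value and
# measure the percentage reduction in output variance that results.
nominal = dict(b=1.75, dp_pa=6000, rmax_km=40)
for name, value in nominal.items():
    runs = []
    for _ in range(n):
        s = sample_inputs()
        s[name] = value
        runs.append(holland_wind(site_r, **s))
    reduction = 100 * (1 - statistics.pvariance(runs) / total_var)
    print(f"fixing {name:8s} cuts wind-speed variance by ~{reduction:5.1f}%")
```

Ranking inputs by how much fixing each one shrinks the output variance is the simplest form of the attribution the abstract describes; production analyses would use proper variance-based indices over many sites and translate wind speed into dollar losses.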
9.
Modeling for Risk Assessment of Neurotoxic Effects   (Total citations: 2; self-citations: 0; citations by others: 2)
The regulation of noncancer toxicants, including neurotoxicants, has usually been based upon a reference dose (allowable daily intake). A reference dose is obtained by dividing a no-observed-effect level by uncertainty (safety) factors to account for intraspecies and interspecies sensitivities to a chemical. It is assumed that the risk at the reference dose is negligible, but no attempt generally is made to estimate the risk at the reference dose. A procedure is outlined that provides estimates of risk as a function of dose. The first step is to establish a mathematical relationship between a biological effect and the dose of a chemical. Knowledge of biological mechanisms and/or pharmacokinetics can assist in the choice of plausible mathematical models. The mathematical model provides estimates of average responses as a function of dose. Second, estimates of risk require selection of a distribution of individual responses about the average response given by the mathematical model. In the case of a normal or lognormal distribution, only an estimate of the standard deviation is needed. The third step is to define an adverse level for a response so that the probability (risk) of exceeding that level can be estimated as a function of dose. Because a firm response level often cannot be established at which adverse biological effects occur, it may be necessary to at least establish an abnormal response level that only a small proportion of individuals would exceed in an unexposed group. That is, if a normal range of responses can be established, then the probability (risk) of abnormal responses can be estimated. In order to illustrate this process, measures of the neurotransmitter serotonin and its metabolite 5-hydroxyindoleacetic acid in specific areas of the brain of rats and monkeys are analyzed after exposure to the neurotoxicant methylenedioxymethamphetamine. These risk estimates are compared with risk estimates from the quantal approach, in which animals are classified as either abnormal or not depending upon abnormal serotonin levels.
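The three-step procedure in the abstract (a dose-response model for the mean, a distribution of individual responses about it, and an abnormal cutoff defined from the unexposed group) can be sketched numerically. Everything below is hypothetical: the exponential dose-response form, the baseline of 100, the standard deviation of 10, and the 1st-percentile cutoff are illustrative stand-ins, not values from the paper:

```python
from math import sqrt, erf, exp

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Step 1: a hypothetical dose-response model for the mean response,
# e.g. serotonin level declining exponentially with dose.
def mean_response(dose, baseline=100.0, slope=0.05):
    return baseline * exp(-slope * dose)

# Step 2: assumed normal distribution of individual responses
# about the model mean, with this standard deviation.
sd = 10.0

# Step 3: define "abnormal" as below the 1st percentile of the
# unexposed group (z = -2.326 for the 1st percentile).
cutoff = mean_response(0.0) - 2.326 * sd

def risk(dose):
    """Probability that an individual's response falls below the cutoff."""
    return normal_cdf((cutoff - mean_response(dose)) / sd)

for d in (0, 5, 10, 20):
    print(f"dose {d:>3}: risk of abnormal response = {risk(d):.3f}")
```

By construction the risk at dose 0 is about 1%, and it rises with dose as the mean response drifts below the cutoff, which is exactly the continuous alternative to the quantal (abnormal/not abnormal) classification the abstract compares against.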
Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)