1.
ABSTRACT

The cost and time of pharmaceutical drug development continue to grow at rates that many say are unsustainable. These trends have an enormous impact on which treatments get to patients, when they get them, and how they are used. The statistical framework for supporting decisions in regulated clinical development of new medicines has followed a traditional path of frequentist methodology. Trials using hypothesis tests of “no treatment effect” are done routinely, and a p-value < 0.05 is often the determinant of what constitutes a “successful” trial. Many drugs fail in clinical development, adding to the cost of new medicines, and some evidence lays part of the blame on deficiencies of the frequentist paradigm. An unknown number of effective medicines may have been abandoned because trials were declared “unsuccessful” due to a p-value exceeding 0.05. Recently, the Bayesian paradigm has shown utility in the clinical drug development process for its probability-based inference. We argue for a Bayesian approach that employs data from other trials as a “prior” for Phase 3 trials, so that evidence synthesized across trials can be used to compute probability statements about the magnitude of the treatment effect. Such a Bayesian paradigm provides a promising framework for improving statistical inference and regulatory decision making.
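The prior-synthesis idea above can be sketched with a conjugate normal-normal model: summarize the earlier trials as a normal prior on the treatment effect, update it with the Phase 3 estimate, and read off a direct probability of benefit. A minimal illustration, not the authors' method — all numbers are hypothetical, and the normal approximation is an assumption:

```python
from statistics import NormalDist

def posterior_treatment_effect(prior_mean, prior_sd, obs_effect, obs_se):
    """Conjugate normal-normal update: combine a prior synthesized from
    earlier trials with the Phase 3 estimate (both on the same effect scale)."""
    w_prior = 1.0 / prior_sd ** 2      # precision of the synthesized prior
    w_data = 1.0 / obs_se ** 2         # precision of the new trial's estimate
    post_var = 1.0 / (w_prior + w_data)
    post_mean = post_var * (w_prior * prior_mean + w_data * obs_effect)
    return post_mean, post_var ** 0.5

# hypothetical numbers: prior built from earlier trials, estimate from Phase 3
m, s = posterior_treatment_effect(prior_mean=0.30, prior_sd=0.20,
                                  obs_effect=0.25, obs_se=0.15)
# the probability statement of direct interest: P(effect > 0 | all evidence)
p_benefit = 1.0 - NormalDist(m, s).cdf(0.0)
```

The posterior is always more precise than either source alone, which is the practical payoff of borrowing strength across trials.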
2.
Summary. We model daily catches of fishing boats in the Grand Bank fishing grounds. We use data on catches per species for a number of vessels collected by the European Union in the context of the Northwest Atlantic Fisheries Organization. Many variables can be thought to influence the amount caught: a number of ship characteristics (such as the size of the ship, the fishing technique used and the mesh size of the nets) are obvious candidates, but one can also consider the season or the actual location of the catch. Our database leads to 28 possible regressors (arising from six continuous variables and four categorical variables, whose 22 levels are treated separately), resulting in a set of 177 million possible linear regression models for the log-catch. Zero observations are modelled separately through a probit model. Inference is based on Bayesian model averaging, using a Markov chain Monte Carlo approach. Particular attention is paid to the prediction of catches for single and aggregated ships.
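At toy scale the model-averaging machinery can be enumerated outright: weight every subset regression by exp(-BIC/2), a standard large-sample approximation to its marginal likelihood, and sum weights to get posterior inclusion probabilities. The 177-million-model space of the paper is exactly why the authors use MCMC instead of this enumeration; the regressor names below are invented for illustration:

```python
import itertools
import numpy as np

def bma_inclusion_probs(X, y, names):
    """Toy Bayesian model averaging over all subsets of candidate regressors.
    Each linear model gets weight exp(-BIC/2); a regressor's posterior
    inclusion probability is the total weight of models containing it."""
    n = len(y)
    log_w, models = [], []
    for k in range(len(names) + 1):
        for subset in itertools.combinations(range(len(names)), k):
            Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
            beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
            rss = float(np.sum((y - Xs @ beta) ** 2))
            bic = n * np.log(rss / n) + Xs.shape[1] * np.log(n)
            log_w.append(-0.5 * bic)
            models.append(set(subset))
    w = np.exp(np.array(log_w) - max(log_w))   # stabilize before normalizing
    w /= w.sum()
    return {name: float(sum(wi for wi, m in zip(w, models) if j in m))
            for j, name in enumerate(names)}

# toy data: log-catch driven by one of two hypothetical candidate regressors
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=200)
incl = bma_inclusion_probs(X, y, ["vessel_size", "mesh_size"])
```

Predictions averaged over models with these weights account for model uncertainty, the point of the paper's approach.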
3.
Time, Self, and the Curiously Abstract Concept of Agency
The term "agency" is quite slippery and is used differently depending on the epistemological roots and goals of scholars who employ it. Distressingly, the sociological literature on the concept rarely addresses relevant social psychological research. We take a social behaviorist approach to agency by suggesting that individual temporal orientations are underutilized in conceptualizing this core sociological concept. Different temporal foci—the actor's engaged response to situational circumstances—implicate different forms of agency. This article offers a theoretical model involving four analytical types of agency ("existential,""identity,""pragmatic," and "life course") that are often conflated across treatments of the topic. Each mode of agency overlaps with established social psychological literatures, most notably about the self, enabling scholars to anchor overly abstract treatments of agency within established research literatures.
5.
Family therapy has not served battered women well. Men use battering to silence women; a woman, once abused, is unlikely to speak honestly in a situation where doing so invites re-abuse. Therefore we rarely perceive, label or deal effectively with male violence toward women, a major source of marital disruption. To stand with the oppressed, we must learn to detect the possibility of abuse, separate the couple, and refuse to collude with criminal acts.
6.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from 2 to 3 billion dollars in losses late on August 12 to a peak of 50 billion dollars for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm hit the resort areas of Charlotte Harbor near Punta Gorda and then went on to Orlando in the central part of the state, with early poststorm estimates converging on a damage estimate in the 28 to 31 billion dollars range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To inform such future analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. Uncertainty analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss that is attributable to each of the input variables.
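The last sentence describes a variance decomposition: by the law of total variance, the expected fractional reduction in output variance from learning input X_i is Var(E[Y | X_i]) / Var(Y), a first-order sensitivity index. It can be estimated by brute-force nested Monte Carlo; the toy two-input loss model below merely stands in for the proprietary wind field model (its functional form and the input roles are invented for illustration):

```python
import random
import statistics

def first_order_index(model, sample_inputs, which, n_groups=100, m=40, seed=42):
    """Fraction of output variance explained by input `which`:
    Var(E[Y | X_which]) / Var(Y), via nested Monte Carlo."""
    rng = random.Random(seed)
    # overall output variance from unconstrained sampling
    outer = [model(sample_inputs(rng)) for _ in range(n_groups * m)]
    var_total = statistics.pvariance(outer)
    # conditional means: fix one input, resample the rest
    cond_means = []
    for _ in range(n_groups):
        fixed = sample_inputs(rng)[which]
        ys = []
        for _ in range(m):
            x = sample_inputs(rng)
            x[which] = fixed
            ys.append(model(x))
        cond_means.append(statistics.fmean(ys))
    return statistics.pvariance(cond_means) / var_total

def toy_loss(x):
    # hypothetical loss model dominated by its first input
    return 3.0 * x[0] + 1.0 * x[1]

def draw_inputs(rng):
    return [rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)]

s_first = first_order_index(toy_loss, draw_inputs, which=0)   # true value 0.9
s_second = first_order_index(toy_loss, draw_inputs, which=1)  # true value 0.1
```

Ranking inputs by this index is what lets an audit say which variables drive the uncertainty in projected losses.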
7.
This work develops a new methodology for discriminating between models for interval-censored data, based on bootstrap residual simulation that observes the deviance difference of one model relative to another, following Hinde (1992). Such data generally produce a large number of tied observations, in which case survival time can be regarded as discrete. The Cox proportional hazards model for grouped data (Prentice & Gloeckler, 1978) and the logistic model (Lawless, 1982) can therefore be fitted by means of generalized linear models. Whitehead (1989) treated censoring as an indicator variable with a binomial distribution and fitted the Cox proportional hazards model using the complementary log-log link function; a logistic model can likewise be fitted using the logit link. The proposed methodology is an alternative to the score tests developed by Colosimo et al. (2000), in which these models arise for discrete binary data as particular cases of the Aranda-Ordaz asymmetric family of distributions, the tests being built on the corresponding link functions. The motivating example is a dataset from an experiment on a flax cultivar planted on four substrata susceptible to the pathogen Fusarium oxysporum. The response variable, the time until blighting, was observed in intervals over 52 days. The results were compared with the model fits and the AIC values.
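The bootstrap discrimination recipe itself is simple: fit both models, record the observed deviance difference, then simulate repeatedly from one fitted model and refit both to build a reference distribution for that difference. The grouped-data GLMs above need an iterative fitter, so as a self-contained stand-in this sketch applies the same recipe to two non-nested discrete models with closed-form MLEs (Poisson versus geometric); the model pair is a hypothetical substitute, not the models of the paper:

```python
import math
import random

def poisson_draw(rng, lam):
    """Knuth's Poisson sampler; adequate for small lam."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def loglik_poisson(data, lam):
    return sum(k * math.log(lam) - lam - math.lgamma(k + 1) for k in data)

def loglik_geometric(data, p):
    # P(K = k) = p * (1 - p)**k for k = 0, 1, 2, ...
    return sum(math.log(p) + k * math.log(1.0 - p) for k in data)

def deviance_diff(data):
    """2 * (loglik of model A - loglik of model B) at the closed-form MLEs."""
    lam = sum(data) / len(data)      # Poisson MLE
    p = 1.0 / (1.0 + lam)            # geometric MLE matching the same mean
    return 2.0 * (loglik_poisson(data, lam) - loglik_geometric(data, p))

def bootstrap_reference(data, n_boot=100, seed=1):
    """Hinde-style bootstrap: simulate from fitted model A and record the
    deviance difference each time, giving a reference distribution against
    which the observed difference between non-nested models is judged."""
    rng = random.Random(seed)
    lam = sum(data) / len(data)
    return [deviance_diff([poisson_draw(rng, lam) for _ in data])
            for _ in range(n_boot)]

# example: data actually generated from model A (Poisson)
rng = random.Random(7)
data = [poisson_draw(rng, 2.0) for _ in range(300)]
obs = deviance_diff(data)          # positive when Poisson fits better
ref = bootstrap_reference(data)    # what such differences look like under A
```

If the observed difference sits comfortably inside the reference distribution, the data are consistent with model A; an extreme value flags the competing model.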
9.
Modeling for Risk Assessment of Neurotoxic Effects
The regulation of noncancer toxicants, including neurotoxicants, has usually been based upon a reference dose (allowable daily intake). A reference dose is obtained by dividing a no-observed-effect level by uncertainty (safety) factors to account for intraspecies and interspecies sensitivities to a chemical. It is assumed that the risk at the reference dose is negligible, but generally no attempt is made to estimate the risk at the reference dose. A procedure is outlined that provides estimates of risk as a function of dose. The first step is to establish a mathematical relationship between a biological effect and the dose of a chemical. Knowledge of biological mechanisms and/or pharmacokinetics can assist in the choice of plausible mathematical models. The mathematical model provides estimates of average responses as a function of dose. The second step, required for estimates of risk, is to select a distribution of individual responses about the average response given by the mathematical model. In the case of a normal or lognormal distribution, only an estimate of the standard deviation is needed. The third step is to define an adverse level for a response so that the probability (risk) of exceeding that level can be estimated as a function of dose. Because a firm response level at which adverse biological effects occur often cannot be established, it may be necessary to at least establish an abnormal response level that only a small proportion of individuals in an unexposed group would exceed. That is, if a normal range of responses can be established, then the probability (risk) of abnormal responses can be estimated. To illustrate this process, measures of the neurotransmitter serotonin and its metabolite 5-hydroxyindoleacetic acid in specific areas of the brain of rats and monkeys are analyzed after exposure to the neurotoxicant methylenedioxymethamphetamine. These risk estimates are compared with risk estimates from the quantal approach, in which animals are classified as abnormal or not depending upon abnormal serotonin levels.
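The three steps read directly as a computation: a dose-response curve for the mean, a normal spread of individual responses about it, and an abnormality cutoff fixed by the unexposed population. A sketch with an invented dose-response curve and standard deviation — nothing here is fitted to the serotonin data:

```python
import math
from statistics import NormalDist

def risk_at_dose(dose, mean_response, sigma, cutoff):
    """Step-wise risk estimate: probability that an individual's response
    falls below the abnormal cutoff, assuming individual responses are
    Normal(mean_response(dose), sigma)."""
    return NormalDist(mean_response(dose), sigma).cdf(cutoff)

# step 1 (hypothetical model): mean neurotransmitter level declines with dose
def neuro_level(dose):
    return 100.0 - 15.0 * math.log1p(dose)

# step 2 (assumed): normal individual variation with this standard deviation
SIGMA = 10.0

# step 3: define "abnormal" as the level only 1% of unexposed individuals
# fall below, so background risk is 1% by construction
cutoff = NormalDist(neuro_level(0.0), SIGMA).inv_cdf(0.01)

background = risk_at_dose(0.0, neuro_level, SIGMA, cutoff)
risk_low = risk_at_dose(1.0, neuro_level, SIGMA, cutoff)
risk_high = risk_at_dose(5.0, neuro_level, SIGMA, cutoff)
```

Unlike a reference dose, this yields an explicit risk number at any dose of interest, which is the point of the procedure.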