991.
We consider hypothesis testing problems for low-dimensional coefficients in a high-dimensional additive hazard model. A variance-reduced partial profiling estimator (VRPPE) is proposed and its asymptotic normality is established, which enables us to test the significance of each single coefficient when the data dimension is much larger than the sample size. Based on the p-values obtained from the proposed test statistics, we then apply a multiple testing procedure to identify significant coefficients and show that the false discovery rate can be controlled at the desired level. The proposed method is also extended to testing a low-dimensional sub-vector of coefficients. The finite-sample performance of the proposed testing procedure is evaluated by simulation studies. We also apply it to two real data sets, one focusing on testing low-dimensional coefficients and the other on identifying significant coefficients through the proposed multiple testing procedure.
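The abstract does not spell out which multiple testing procedure is applied to the p-values; as a minimal sketch, assuming the standard Benjamini–Hochberg step-up rule (a common FDR-controlling choice, not necessarily the authors' exact procedure):

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up rule at FDR level q.

    Returns a boolean mask marking which hypotheses are rejected.
    """
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest k with p_(k) <= (k / m) * q, then reject the k smallest.
    thresholds = (np.arange(1, m + 1) / m) * q
    below = ranked <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        reject[order[: k + 1]] = True
    return reject

# Toy p-values: three strong signals among ten hypotheses.
pvals = [0.001, 0.004, 0.010, 0.30, 0.45, 0.55, 0.62, 0.70, 0.81, 0.95]
print(benjamini_hochberg(pvals, q=0.05))  # only the three smallest survive
```

In a high-dimensional screen the same mask would be applied to the per-coefficient p-values produced by the VRPPE test statistics.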
992.
This paper focuses on efficient estimation, optimal rates of convergence, and effective algorithms in the partly linear additive hazards regression model with current status data. We use polynomial splines to estimate both the cumulative baseline hazard function, with a monotonicity constraint, and the nonparametric regression functions, with no such constraint. We propose simultaneous sieve maximum likelihood estimation of the regression parameters and nuisance parameters and show that the resulting estimator of the regression parameter vector is asymptotically normal and achieves the semiparametric information bound. In addition, we show that the rates of convergence for the estimators of the nonparametric functions are optimal. We implement the proposed estimation through a backfitting algorithm based on generalized linear models. We conduct simulation studies to examine the finite-sample performance of the proposed estimation method and present an analysis of renal function recovery data for illustration.
993.
Fabienne Comte, Céline Duval, Valentine Genon-Catalot, Johanna Kappus 《Scandinavian Journal of Statistics》2015,42(4):1023-1044
In this paper, we consider a mixed compound Poisson process, that is, a random sum of independent and identically distributed (i.i.d.) random variables where the number of terms is a Poisson process with random intensity. We study nonparametric estimators of the jump density by specific deconvolution methods. First, assuming that the random intensity has an exponential distribution with unknown expectation, we propose two types of estimators based on the observation of an i.i.d. sample. Risk bounds and adaptive procedures are provided. Then, with no assumption on the distribution of the random intensity, we propose two nonparametric estimators of the jump density based on the joint observation of the number of jumps and the random sum of jumps. Risk bounds are provided, leading to unusual rates for one of the two estimators. The methods are implemented and compared via simulations.
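The observation scheme in the second part (joint observation of the number of jumps and the random sum of jumps) is easy to simulate. A sketch under illustrative assumptions: exponential random intensity as in the paper's first setting, but a Gaussian jump law chosen here purely for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_mixed_compound_poisson(n, t=1.0, intensity_mean=2.0, jump_sampler=None):
    """Draw n observations (N_i, S_i): N_i jumps on [0, t] of a Poisson
    process with random exponential intensity; S_i is the sum of i.i.d. jumps.

    jump_sampler is a hypothetical placeholder for the unknown jump density.
    """
    if jump_sampler is None:
        jump_sampler = lambda size: rng.normal(1.0, 0.5, size)  # illustrative jump law
    lam = rng.exponential(intensity_mean, n)   # random intensity, one per path
    counts = rng.poisson(lam * t)              # mixed Poisson number of jumps
    sums = np.array([jump_sampler(k).sum() if k else 0.0 for k in counts])
    return counts, sums

counts, sums = sample_mixed_compound_poisson(5000)
# E[N] = E[Lambda] * t = 2.0, so the sample mean of counts should be close to 2
print(counts.mean())
```

Pairs `(counts, sums)` generated this way are the raw input a deconvolution estimator of the jump density would consume.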
994.
995.
996.
Risk patterns in drug safety study using relative times by accelerated failure time models when proportional hazards assumption is questionable: an illustrative case study of cancer risk of patients on glucose-lowering therapies
Edmond S.-W. Ng, Olaf H. Klungel, Rolf H. H. Groenwold, Tjeerd-Pieter van Staa 《Pharmaceutical statistics》2015,14(5):382-394
Observational drug safety studies may be susceptible to confounding or protopathic bias. This bias may cause a spurious relationship between drug exposure and an adverse side effect when none exists and may lead to unwarranted safety alerts. The spurious relationship may manifest itself through substantially different risk levels between exposure groups at the start of follow-up, when exposure is deemed too short to have any plausible biological effect of the drug. The restrictive proportional hazards assumption, with its arbitrary choice of baseline hazard function, renders the commonly used Cox proportional hazards model of limited use for revealing such potential bias. We demonstrate a fully parametric approach using accelerated failure time models with an illustrative safety study of glucose-lowering therapies and show that its results are comparable with those of other methods that allow time-varying exposure effects. Our approach includes a wide variety of models based on the flexible generalized gamma distribution and allows direct comparisons of estimated hazard functions following different exposure-specific distributions of survival times. This approach lends itself to two alternative metrics, namely relative times and differences in times to event, giving physicians more ways to communicate a patient's prognosis without invoking the concept of risks, which some may find hard to grasp. In our illustrative case study, substantial differences in cancer risks at drug initiation, followed by a gradual reduction towards the null, were found. This evidence is compatible with the presence of protopathic bias, in which undiagnosed symptoms of cancer lead to switches in diabetes medication. Copyright © 2015 John Wiley & Sons, Ltd.
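The "relative times" metric can be made concrete with the log-normal special case of the generalized gamma family: in an accelerated failure time model, exp(β) is the factor by which event times are stretched or shrunk in the exposed group. A Monte Carlo sketch with made-up parameters (not the study's estimates):

```python
import numpy as np

# Log-normal AFT: log T = mu + beta * x + sigma * eps, eps ~ N(0, 1).
# x = 1 flags the exposed group; exp(beta) is the relative time.
rng = np.random.default_rng(1)
mu, beta, sigma = 2.0, -0.5, 0.8   # hypothetical parameter values
n = 200_000
x = rng.integers(0, 2, n)
t = np.exp(mu + beta * x + sigma * rng.standard_normal(n))

# Median event time per group; their ratio estimates exp(beta),
# i.e. exposed patients reach the event in ~61% of the unexposed time here.
ratio = np.median(t[x == 1]) / np.median(t[x == 0])
print(ratio, np.exp(beta))
```

Unlike a hazard ratio, the time ratio speaks directly in units of time, which is the communication advantage the abstract alludes to.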
997.
Experimental designs for detecting synergy and antagonism between two drugs in a pre-clinical study
Matthew Sperrin, Helene Thygesen, Ting-Li Su, Chris Harbron, Anne Whitehead 《Pharmaceutical statistics》2015,14(3):216-225
The identification of synergistic interactions between combinations of drugs is an important area within drug discovery and development. Pre-clinically, large numbers of screening studies to identify synergistic pairs of compounds can often be run, necessitating efficient and robust experimental designs. We consider experimental designs for detecting interaction between two drugs in a pre-clinical in vitro assay in the presence of uncertainty about the monotherapy response. The monotherapies are assumed to follow the Hill equation with common lower and upper asymptotes and a common variance. The optimality criterion used is the variance of the interaction parameter. We focus on ray designs and investigate two algorithms for selecting the optimal set of dose combinations. The first is a forward algorithm in which design points are added sequentially. This is found to give useful solutions in simple cases but can lack robustness when knowledge about the monotherapy parameters is insufficient. The second algorithm is a more pragmatic approach in which the design points are constrained to be distributed log-normally along the rays and monotherapy doses. We find that the pragmatic algorithm is more stable than the forward algorithm, and even when the forward algorithm has converged, the pragmatic algorithm can still outperform it. Practically, we find that good designs for detecting an interaction have equal numbers of points on monotherapies and combination therapies, with those points typically placed where a 50% response is expected. More uncertainty in the monotherapy parameters leads to an optimal design whose design points are more spread out. Copyright © 2015 John Wiley & Sons, Ltd.
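The monotherapy model named above is the four-parameter Hill equation. A minimal sketch (parameter values illustrative, not from the paper) showing why design points at the 50%-response dose are natural: at d = EC50 the curve sits exactly halfway between the lower and upper asymptotes.

```python
import numpy as np

def hill(dose, lower=0.0, upper=1.0, ec50=1.0, slope=1.0):
    """Four-parameter Hill equation for a monotherapy dose-response curve.

    lower/upper are the shared asymptotes; ec50 is the dose giving the
    half-maximal response; slope is the Hill coefficient. dose must be > 0.
    """
    dose = np.asarray(dose, dtype=float)
    return lower + (upper - lower) / (1.0 + (ec50 / dose) ** slope)

# At d = EC50 the response is the midpoint of the asymptotes: 0.5 here.
print(hill(1.0))
```

Placing monotherapy and combination points near this midpoint is where the curve is steepest in dose, which is what makes those doses informative about the interaction parameter.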
998.
999.
Juan-José Ganuza, José S. Penalva 《Econometrica : journal of the Econometric Society》2010,78(3):1007-1030
This paper provides a novel approach to ordering signals based on the property that more informative signals lead to greater variability of conditional expectations. We define two nested information criteria (supermodular precision and integral precision) by combining this approach with two variability orders (dispersive and convex orders). We relate precision criteria with orderings based on the value of information to a decision maker. We then use precision to study the incentives of an auctioneer to supply private information. Using integral precision, we obtain two results: (i) a more precise signal yields a more efficient allocation; (ii) the auctioneer provides less than the efficient level of information. Supermodular precision allows us to extend the previous analysis to the case in which supplying information is costly and to obtain an additional finding; (iii) there is a complementarity between information and competition, so that both the socially efficient and the auctioneer's optimal choice of precision increase with the number of bidders.
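The ordering principle, that more informative signals produce more variable conditional expectations, is easy to see in a Gaussian example (hypothetical, not the paper's general setting): with V ~ N(0, 1) and S = V + noise of standard deviation σ, the posterior mean is S/(1 + σ²) and Var(E[V|S]) = 1/(1 + σ²), which grows as the noise shrinks.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
v = rng.standard_normal(n)  # the value being learned about, V ~ N(0, 1)

def var_of_posterior_mean(noise_sd):
    """Monte Carlo Var(E[V | S]) for S = V + noise.

    For this Gaussian pair the posterior mean is S / (1 + noise_sd**2),
    so the closed form is 1 / (1 + noise_sd**2).
    """
    s = v + noise_sd * rng.standard_normal(n)
    return np.var(s / (1.0 + noise_sd**2))

# A more informative signal (smaller noise) yields a more variable
# conditional expectation: ~0.8 for sd 0.5 versus ~0.2 for sd 2.0.
print(var_of_posterior_mean(0.5), var_of_posterior_mean(2.0))
```

In the dispersive and convex orders of the paper, this greater spread of the posterior mean is exactly what ranks the low-noise signal as more precise.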
1000.
Bruno Biais, Thomas Mariotti, Jean-Charles Rochet, Stéphane Villeneuve 《Econometrica : journal of the Econometric Society》2010,78(1):73-118
We study a continuous-time principal–agent model in which a risk-neutral agent with limited liability must exert unobservable effort to reduce the likelihood of large but relatively infrequent losses. Firm size can be decreased at no cost or increased subject to adjustment costs. In the optimal contract, investment takes place only if a long enough period of time elapses with no losses occurring. Then, if good performance continues, the agent is paid. As soon as a loss occurs, payments to the agent are suspended, and so is investment if further losses occur. Accumulated bad performance leads to downsizing. We derive explicit formulae for the dynamics of firm size and its asymptotic growth rate, and we provide conditions under which firm size eventually goes to zero or grows without bounds.