Full-text access type
Paid full text | 2260 articles |
Free | 124 articles |
Subject classification
Management | 339 articles |
Ethnology | 20 articles |
Talent studies | 2 articles |
Demography | 203 articles |
Collected works | 14 articles |
Theory and methodology | 340 articles |
General | 11 articles |
Sociology | 1126 articles |
Statistics | 329 articles |
Publication year
2023 | 24 articles |
2022 | 11 articles |
2021 | 16 articles |
2020 | 57 articles |
2019 | 73 articles |
2018 | 75 articles |
2017 | 112 articles |
2016 | 92 articles |
2015 | 76 articles |
2014 | 84 articles |
2013 | 353 articles |
2012 | 86 articles |
2011 | 101 articles |
2010 | 74 articles |
2009 | 79 articles |
2008 | 80 articles |
2007 | 74 articles |
2006 | 77 articles |
2005 | 73 articles |
2004 | 73 articles |
2003 | 59 articles |
2002 | 67 articles |
2001 | 50 articles |
2000 | 35 articles |
1999 | 46 articles |
1998 | 24 articles |
1997 | 25 articles |
1996 | 25 articles |
1995 | 22 articles |
1994 | 24 articles |
1993 | 32 articles |
1992 | 23 articles |
1991 | 18 articles |
1990 | 20 articles |
1989 | 12 articles |
1988 | 15 articles |
1987 | 16 articles |
1986 | 8 articles |
1985 | 14 articles |
1984 | 10 articles |
1983 | 12 articles |
1982 | 11 articles |
1981 | 9 articles |
1980 | 12 articles |
1979 | 18 articles |
1978 | 11 articles |
1977 | 14 articles |
1976 | 6 articles |
1975 | 17 articles |
1974 | 11 articles |
Sort order: 2384 results found (search time: 78 ms)
1.
This paper analyses gendered mobilities in Bishkek in the space of the most popular form of public transport: the minibus, or ‘marshrutka’. As the means by which women often access various important sites of daily life, the marshrutka itself is a site of negotiation and interaction. Utilizing theories of mobility and empirical data, we argue that marshrutkas are spaces that can give rise to two dichotomous conditions: positive marshrutka experiences may increase the social mobility of female passengers and subsequently increase social empowerment and influence, while negative ones can provide the grounds for social exclusion and gender inequality.
2.
Proportional hazards are a common assumption when designing confirmatory clinical trials in oncology. This assumption not only affects the analysis part but also the sample size calculation. The presence of delayed effects causes a change in the hazard ratio while the trial is ongoing since at the beginning we do not observe any difference between treatment arms, and after some unknown time point, the differences between treatment arms will start to appear. Hence, the proportional hazards assumption no longer holds, and both sample size calculation and analysis methods to be used should be reconsidered. The weighted log‐rank test allows a weighting for early, middle, and late differences through the Fleming and Harrington class of weights and is proven to be more efficient when the proportional hazards assumption does not hold. The Fleming and Harrington class of weights, along with the estimated delay, can be incorporated into the sample size calculation in order to maintain the desired power once the treatment arm differences start to appear. In this article, we explore the impact of delayed effects in group sequential and adaptive group sequential designs and make an empirical evaluation in terms of power and type‐I error rate of the weighted log‐rank test in a simulated scenario with fixed values of the Fleming and Harrington class of weights. We also give some practical recommendations regarding which methodology should be used in the presence of delayed effects depending on certain characteristics of the trial.
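The weighted log-rank statistic described above can be sketched in a few lines. The following is a minimal illustration, not the paper's code: the function name and toy data are ours, and the Fleming-Harrington weights w(t) = S(t-)^ρ (1 - S(t-))^γ are built from the pooled Kaplan-Meier estimate, so FH(0, 1) up-weights late differences as is useful under delayed effects.

```python
import numpy as np

def fh_weighted_logrank(time, event, group, rho=0.0, gamma=1.0):
    """Two-sample weighted log-rank test with Fleming-Harrington weights.

    rho=0, gamma=0 recovers the standard log-rank test; rho=0, gamma=1
    emphasizes late separation of the survival curves.
    Returns the standardized statistic Z = U / sqrt(Var(U)).
    """
    time = np.asarray(time, dtype=float)
    event = np.asarray(event, dtype=bool)
    group = np.asarray(group, dtype=int)

    U, V = 0.0, 0.0
    S_prev = 1.0  # pooled Kaplan-Meier estimate just before the current time
    for t in np.unique(time[event]):
        at_risk = time >= t
        n_i = at_risk.sum()                        # total at risk
        n1_i = (at_risk & (group == 1)).sum()      # at risk in group 1
        d_i = (event & (time == t)).sum()          # total events at t
        d1_i = (event & (time == t) & (group == 1)).sum()
        w = S_prev**rho * (1.0 - S_prev)**gamma    # FH(rho, gamma) weight
        U += w * (d1_i - d_i * n1_i / n_i)         # observed minus expected
        if n_i > 1:
            V += w**2 * d_i * (n1_i / n_i) * (1 - n1_i / n_i) * (n_i - d_i) / (n_i - 1)
        S_prev *= 1.0 - d_i / n_i                  # update pooled KM after t
    return U / np.sqrt(V)

# Toy data: group 1 clearly survives longer, so Z is negative.
z = fh_weighted_logrank([1, 2, 3, 4, 5, 6, 7, 8], [1] * 8,
                        [0, 0, 0, 0, 1, 1, 1, 1], rho=0.0, gamma=1.0)
print(round(z, 3))
```

In practice one would reach for a library routine, but the sketch makes explicit where the weight enters the observed-minus-expected sum and its variance.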
3.
Christopher Chase‐Dunn Peter Grimes Eugene N. Anderson 《Revue canadienne de sociologie》2019,56(4):529-555
An understanding of the current right‐wing national and transnational social movements can benefit from comparing them to the global and national conditions operating during their last appearance in the first half of the twentieth century and by carefully comparing twentieth‐century fascism with the neofascist and right‐wing populist movements that have been emerging in the twenty‐first century. This allows us to assess the similarities and differences, and to gain insights about what could be the consequences of the reemergence of populist nationalism and fascist movements. Our study uses the comparative evolutionary world‐systems perspective to study the Global Right from 1800 to the present. We see fascism as a form of capitalism that emerges when the capitalist project is in crisis. World historical waves of right‐wing populism and fascism are caused by the cycles of globalization and deglobalization, the rise and fall of hegemonic core powers, long business cycles (the Kondratieff wave), and interactions with both Centrist Liberalism and the Global Left. We consider how crises of the global capitalist system have produced right‐wing backlashes in the past, and how a future terminal crisis of capitalism could lead to a reemergence of a new form of authoritarian global governance or a reorganized global democracy in the future.
4.
CORRUPTION: TOP DOWN OR BOTTOM UP? Cited by: 3 (self-citations: 0, citations by others: 3)
This article studies the impact of corruption on an economy with a hierarchical government. In particular, we study whether centralizing corruption within the higher level of government increases or decreases the total amount of corruption. We show that when the after-tax relative profitability of the formal sector as compared to that of the informal sector is high enough, adding a layer of government increases the total amount of corruption. By contrast, for high-enough public wages and/or an efficient monitoring technology of the bureaucratic system, centralization of corruption at the top of the government hierarchy redistributes bribe income from the lower level to the upper level. In the process, total corruption is reduced and the formal sector of the economy expands.
5.
Alison Snow Jones 《Journal of Family and Economic Issues》2002,23(1):3-25
The association between drinking and selected job characteristics among women aged 24 to 31 is examined. Using the 1989 NLSY, women are classified as alcohol abusers or dependent based on DSM-III-R criteria or as heavy drinkers based on reported frequency of six or more drinks. Heavy drinking is negatively associated with wage and non-wage compensation. These effects diminish when human capital measures are controlled. Current alcoholism and current and past heavy drinking are strongly negatively associated with years of schooling. The association between alcoholism and job compensation and characteristics is not as strong as that seen for heavy drinking. It is not known if this is a consequence of errors in identifying alcoholic women in population-based surveys.
6.
A NOTE ON EVANESCENT PROCESSES Cited by: 1 (self-citations: 0, citations by others: 1)
This note examines the connection between μ-invariant measures for the transition function of a continuous-time Markov chain and those of its q-matrix, Q. The major result establishes a necessary and sufficient condition for a convergent μ-invariant measure for Q to be μ-invariant for the minimal transition function, P, under the assumption that P is honest. This corrects Theorem 6 of Vere-Jones (1969) and the first part of Corollary 1 of Pollett (1986), both of which assert that the above conclusion holds in the absence of this condition. The error was pointed out by E.A. van Doorn (1991), and the counterexample which he presented provides the basis for the present arguments. In determining where the error occurred in the original proof, we are able to identify a simple sufficient condition for μ-invariance.
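For orientation, the two notions of μ-invariance that the note relates are standardly defined as follows; the display uses conventional notation for a measure m = (m_j) on a state space S and is ours, not quoted from the paper:

```latex
% \mu-invariance for the q-matrix Q:
\sum_{i \in S} m_i \, q_{ij} = -\mu \, m_j \qquad (j \in S),
% \mu-invariance for the transition function P:
\sum_{i \in S} m_i \, p_{ij}(t) = e^{-\mu t} \, m_j \qquad (j \in S,\; t \ge 0).
```

The note's question is when the first (algebraic) condition on Q transfers to the second (analytic) condition on the minimal transition function P.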
7.
Craig H. Mallinckrodt Christopher J. Kaiser John G. Watkin Michael J. Detke Geert Molenberghs Raymond J. Carroll 《Pharmaceutical statistics》2004,3(3):171-186
The last observation carried forward (LOCF) approach is commonly utilized to handle missing values in the primary analysis of clinical trials. However, recent evidence suggests that likelihood‐based analyses developed under the missing at random (MAR) framework are sensible alternatives. The objective of this study was to assess the Type I error rates from a likelihood‐based MAR approach – mixed‐model repeated measures (MMRM) – compared with LOCF when estimating treatment contrasts for mean change from baseline to endpoint (Δ). Data emulating neuropsychiatric clinical trials were simulated in a 4 × 4 factorial arrangement of scenarios, using four patterns of mean changes over time and four strategies for deleting data to generate subject dropout via an MAR mechanism. In data with no dropout, estimates of Δ and SEΔ from MMRM and LOCF were identical. In data with dropout, the Type I error rates (averaged across all scenarios) for MMRM and LOCF were 5.49% and 16.76%, respectively. In 11 of the 16 scenarios, the Type I error rate from MMRM was at least 1.00% closer to the expected rate of 5.00% than the corresponding rate from LOCF. In no scenario did LOCF yield a Type I error rate that was at least 1.00% closer to the expected rate than the corresponding rate from MMRM. The average estimate of SEΔ from MMRM was greater in data with dropout than in complete data, whereas the average estimate of SEΔ from LOCF was smaller in data with dropout than in complete data, suggesting that standard errors from MMRM better reflected the uncertainty in the data. The results from this investigation support those from previous studies, which found that MMRM provided reasonable control of Type I error even in the presence of MNAR missingness. No universally best approach to analysis of longitudinal data exists. However, likelihood‐based MAR approaches have been shown to perform well in a variety of situations and are a sensible alternative to the LOCF approach. 
MNAR methods can be used within a sensitivity analysis framework to test the potential presence and impact of MNAR data, thereby assessing robustness of results from an MAR method. Copyright © 2004 John Wiley & Sons, Ltd.
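As a concrete reminder of what the LOCF comparator in the study above does, it amounts to a per-subject forward fill of the outcome after dropout. The following pandas sketch uses toy data of our own, not the simulated trial data from the study:

```python
import pandas as pd

# Toy longitudinal data: one row per subject-visit; NaN marks visits
# after the subject dropped out.
df = pd.DataFrame({
    "subject": [1, 1, 1, 2, 2, 2],
    "visit":   [1, 2, 3, 1, 2, 3],
    "score":   [10.0, 12.0, 13.0, 9.0, None, None],
})

# LOCF: within each subject, carry the last observed value forward,
# so subject 2's baseline score of 9.0 is imputed at visits 2 and 3.
df["score_locf"] = df.groupby("subject")["score"].ffill()
print(df["score_locf"].tolist())  # [10.0, 12.0, 13.0, 9.0, 9.0, 9.0]
```

An MMRM analysis, by contrast, would fit a mixed model to the observed data only, which is why its standard errors grow with dropout while LOCF's shrink.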
8.
Chris Jones 《Significance》2008,5(1):46-48
It rises massive and magnificent above Salisbury Plain. The extraordinary monoliths of Stonehenge leave us as impressed today as ever, and as baffled. Archaeologists argue about how it was made. Can statisticians provide the answer? Chris Jones thinks they can be of considerable assistance.
9.
10.
We describe an image reconstruction problem and the computational difficulties arising in determining the maximum a posteriori (MAP) estimate. Two algorithms for tackling the problem, iterated conditional modes (ICM) and simulated annealing, are usually applied pixel by pixel. The performance of this strategy can be poor, particularly for heavily degraded images, and as a potential improvement Jubb and Jennison (1991) suggest the cascade algorithm in which ICM is initially applied to coarser images formed by blocking squares of pixels. In this paper we attempt to resolve certain criticisms of cascade and present a version of the algorithm extended in definition and implementation. As an illustration we apply our new method to a synthetic aperture radar (SAR) image. We also carry out a study of simulated annealing, with and without cascade, applied to a more tractable minimization problem from which we gain insight into the properties of cascade algorithms. 相似文献