81.
Proportional reversed hazard rate model and its applications   (Cited: 1; self-citations: 0; citations by others: 1)
The purpose of this paper is to study the structure and properties of the proportional reversed hazard rate model (PRHRM), in contrast to the celebrated proportional hazards model (PHM). The monotonicity of the hazard rate and the reversed hazard rate of the model is investigated. Some criteria of aging are presented, and the inheritance of the aging notions of the base distribution by the PRHRM is studied. Characterizations of the model involving Fisher information are presented, and statistical inference for the parameters is discussed. Finally, several members of the proportional reversed hazard rate class are shown to be useful and flexible in real data analysis.
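The defining property of the PRHRM can be sketched numerically: the distribution function is raised to a power θ, so the reversed hazard rate r(x) = f(x)/F(x) is scaled by exactly θ. A minimal sketch, assuming an exponential baseline distribution (an illustrative choice, not from the paper):

```python
import numpy as np

# Baseline CDF and density: exponential with rate 1 (illustrative choice).
F = lambda x: 1.0 - np.exp(-x)
f = lambda x: np.exp(-x)

def reversed_hazard(cdf, pdf, x):
    """Reversed hazard rate r(x) = f(x) / F(x)."""
    return pdf(x) / cdf(x)

theta = 2.5  # proportionality parameter of the PRHRM

# PRHRM: F_theta(x) = F(x)**theta, hence f_theta(x) = theta*F(x)**(theta-1)*f(x).
F_theta = lambda x: F(x) ** theta
f_theta = lambda x: theta * F(x) ** (theta - 1.0) * f(x)

x = np.linspace(0.5, 5.0, 10)
r_base = reversed_hazard(F, f, x)
r_prhr = reversed_hazard(F_theta, f_theta, x)

# The reversed hazard of the PRHRM is theta times the baseline reversed hazard.
print(np.allclose(r_prhr, theta * r_base))  # True
```

The cancellation is exact: f_θ/F_θ = θ F^{θ-1} f / F^θ = θ f/F, which is the reversed-hazard analogue of the PHM's proportional hazards.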
83.
To capture mean and variance asymmetries and time‐varying volatility in financial time series, we generalize the threshold stochastic volatility (THSV) model and incorporate a heavy‐tailed error distribution. Unlike existing stochastic volatility models, this model simultaneously accounts for uncertainty in the unobserved threshold value and in the time‐delay parameter. Self‐exciting and exogenous threshold variables are considered to investigate the impact of a number of market news variables on volatility changes. Adopting a Bayesian approach, we use Markov chain Monte Carlo methods to estimate all unknown parameters and latent variables. A simulation experiment demonstrates good estimation performance for reasonable sample sizes. In a study of two international financial market indices, we consider two variants of the generalized THSV model, with US market news as the threshold variable. Finally, we compare models using Bayesian forecasting in a value‐at‐risk (VaR) study. The results show that our proposed model can generate more accurate VaR forecasts than can standard models.
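A self-exciting, two-regime version of the data-generating process behind such a model can be sketched as follows; the parameter values, the two-regime structure, the threshold and delay, and the Student-t degrees of freedom are all illustrative assumptions, not estimates from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-regime THSV parameters: regime 0 applies when the lagged
# return y_{t-d} is at or below the threshold, regime 1 when it is above.
alpha = (-0.4, -0.2)   # regime intercepts of log-volatility
phi = (0.90, 0.95)     # regime persistence of log-volatility
sigma_eta = 0.2        # volatility of log-volatility
threshold, delay = 0.0, 1

T = 1000
y = np.zeros(T)        # returns
h = np.zeros(T)        # latent log-volatility
for t in range(1, T):
    regime = int(y[t - delay] > threshold) if t >= delay else 0
    h[t] = alpha[regime] + phi[regime] * h[t - 1] + sigma_eta * rng.normal()
    # Heavy-tailed measurement errors: Student-t with 5 degrees of freedom.
    y[t] = np.exp(h[t] / 2.0) * rng.standard_t(df=5)

print(y[:5])
```

In a Bayesian fit, the threshold and the delay would be treated as unknowns with their own priors and sampled by MCMC along with (α, φ, σ) and the latent path h.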
84.
Summary. The paper develops methods for the design of experiments for mechanistic models when the response must be transformed to achieve symmetry and constant variance. The power transformation that is used is partially justified by a rule in analytical chemistry. Because of the nature of the relationship between the response and the mechanistic model, it is necessary to transform both sides of the model. Expressions are given for the parameter sensitivities in the transformed model and examples are given of optimum designs, not only for single-response models, but also for experiments in which multivariate responses are measured and for experiments in which the model is defined by a set of differential equations which cannot be solved analytically. The extension to designs for checking models is discussed.
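The parameter sensitivities of a transform-both-sides model follow from the chain rule: for a Box-Cox power λ, the sensitivity of the transformed response is f^(λ-1) times the untransformed sensitivity, and an optimum design maximizes a criterion built from these sensitivities. A minimal sketch for a hypothetical first-order decay model (the model, parameter values, and candidate designs are illustrative, not from the paper):

```python
import numpy as np

# Illustrative mechanistic model: first-order decay f(t; c0, k) = c0*exp(-k*t).
def sensitivities(t, c0, k, lam):
    """Sensitivities of the Box-Cox-transformed response w.r.t. (c0, k)."""
    f = c0 * np.exp(-k * t)
    df_dc0 = np.exp(-k * t)
    df_dk = -c0 * t * np.exp(-k * t)
    scale = f ** (lam - 1.0)          # chain-rule factor from the transform
    return np.column_stack([scale * df_dc0, scale * df_dk])

def d_criterion(t, c0=10.0, k=0.5, lam=0.5):
    """log-det of the information matrix X'X for design points t (D-optimality)."""
    X = sensitivities(np.asarray(t, dtype=float), c0, k, lam)
    return np.linalg.slogdet(X.T @ X)[1]

# Compare two candidate two-point designs for estimating (c0, k):
# well-spread sampling times versus two early, nearly coincident times.
print(d_criterion([0.5, 4.0]), d_criterion([0.1, 0.2]))
```

As expected, the spread design dominates: two nearly coincident sampling times make the two sensitivity columns close to collinear, so the information determinant collapses.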
85.
Summary.  Wavelet shrinkage is an effective nonparametric regression technique, especially when the underlying curve has irregular features such as spikes or discontinuities. The basic idea is simple: take the discrete wavelet transform of data consisting of a signal corrupted by noise; shrink or remove the wavelet coefficients to remove the noise; then invert the discrete wavelet transform to form an estimate of the true underlying curve. Various researchers have proposed increasingly sophisticated methods of doing this by using real-valued wavelets. Complex-valued wavelets exist but are rarely used. We propose two new complex-valued wavelet shrinkage techniques: one based on multiwavelet style shrinkage and the other using Bayesian methods. Extensive simulations show that our methods almost always give significantly more accurate estimates than methods based on real-valued wavelets. Further, our multiwavelet style shrinkage method is both simpler and dramatically faster than its competitors. To understand the excellent performance of this method we present a new risk bound on its hard thresholded coefficients.
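The basic transform-threshold-invert recipe that the paper builds on can be sketched with a one-level Haar transform and hard thresholding (the step signal, noise level, and universal threshold below are illustrative choices; the paper itself concerns complex-valued wavelets):

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar transform."""
    s = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return s, d

def haar_idwt(s, d):
    """Invert one level of the orthonormal Haar transform."""
    x = np.empty(2 * len(s))
    x[0::2] = (s + d) / np.sqrt(2.0)
    x[1::2] = (s - d) / np.sqrt(2.0)
    return x

def denoise(x, thresh):
    s, d = haar_dwt(x)
    d_hard = np.where(np.abs(d) > thresh, d, 0.0)  # hard thresholding
    return haar_idwt(s, d_hard)

rng = np.random.default_rng(1)
n, sigma = 256, 0.5
signal = np.where(np.arange(n) < n // 2, 0.0, 4.0)   # a step discontinuity
noisy = signal + sigma * rng.normal(size=n)
estimate = denoise(noisy, thresh=sigma * np.sqrt(2.0 * np.log(n)))  # universal threshold

print(np.mean((estimate - signal) ** 2) < np.mean((noisy - signal) ** 2))
```

A step signal is sparse in the Haar detail coefficients, so thresholding removes most of the noise energy while leaving the discontinuity intact; multi-level and complex-valued schemes refine this same skeleton.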
86.
Summary.  We estimate cause–effect relationships in empirical research where exposures are not completely controlled, as in observational studies or with patient non-compliance and self-selected treatment switches in randomized clinical trials. Additive and multiplicative structural mean models have proved useful for this but suffer from the classical limitations of linear and log-linear models when accommodating binary data. We propose the generalized structural mean model to overcome these limitations. This is a semiparametric two-stage model which extends the structural mean model to handle non-linear average exposure effects. The first-stage structural model describes the causal effect of received exposure by contrasting the means of observed and potential exposure-free outcomes in exposed subsets of the population. For identification of the structural parameters, a second-stage 'nuisance' model is introduced. This takes the form of a classical association model for expected outcomes given observed exposure. Under the model, we derive estimating equations which yield consistent, asymptotically normal and efficient estimators of the structural effects. We examine their robustness to model misspecification and construct robust estimators in the absence of any exposure effect. The double-logistic structural mean model is developed in more detail to estimate the effect of observed exposure on the success of treatment in a randomized controlled blood pressure reduction trial with self-selected non-compliance.
87.
Boundary Spaces     
While shows like The X-Files and 24 have merged conspiracy theories with popular science (fictions), some video games have been pushing the narrative even further. Electronic Arts' Majestic game was released in July 2001 and quickly generated media buzz with its unusual multi-modal gameplay. Mixing phone calls, faxes, instant messaging, real and "fake" websites, and email, the game provides a fascinating case of an attempt at new directions for gaming communities. Through story, mode of playing, and use of technology, Majestic highlights the uncertain status of knowledge, community and self in a digital age; at the same time, it allows examination of alternative ways of understanding games' role and purpose in the larger culture. Drawing on intricate storylines involving government conspiracies, techno-bio warfare, murder and global terror, players were asked to solve mysteries in the hopes of preventing a devastating future of domination. Because the game drew in both actual and Majestic-owned and -designed websites, it constantly pushed players right to the borders where simulation collides with "factuality". Given the wide variety of "legitimate" conspiracy-theory, alien-encounter and alternative-science web pages, users often could not distinguish when they were leaving the game's pages and venturing into "real" World Wide Web sites. Its further use of AOL's instant messenger system, in which gamers spoke not only to bots but to other players, pushed users to constantly evaluate both the status of those they were talking to and the information being provided. Additionally, the game required players to occupy unfamiliar subject positions, ones where agency was attenuated, which subsequently generated a multi-layered sense of unease among players. This mix of authentic and staged information, in conjunction with technologically mediated roles, highlights what are often seen as phenomena endemic to the Internet itself: the destabilization of categories of knowing, relating, and being.
88.
A mechanistic model is presented describing the clearance of a compound in a precision-cut liver slice that is incubated in a culture medium. The problem of estimating metabolic rate constants in PBPK models from liver slice experiments is discussed using identifiability analysis. From this analysis, it appears that, in addition to the clearance, the compound's free fraction in the slice and the diffusion rate of the exchange of the compound between culture medium and liver slice should be identified. In addition, knowledge of the culture medium volume, the slice volume, the compound's free fraction, and the octanol-water-based partition between medium and slice is presupposed. The formal solution for identification is discussed from the perspective of experimental practice. A formally necessary condition for identification is the sampling of the parent compound in the liver slice or the culture medium. However, due to experimental limitations and errors, sampling the parent compound in the slice, together with additional sampling of metabolite pooled from the medium and the slice, is required for identification in practice. Moreover, identification results appear unreliable when the value of the intrinsic clearance exceeds the value of the diffusion coefficient, a condition to be verified a posteriori.
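The diffusion-limited identifiability problem can be illustrated with a toy two-compartment simulation: once intrinsic clearance greatly exceeds the diffusion coefficient, the observable medium concentration profile barely responds to changes in clearance. All volumes, rates, and units below are hypothetical choices for illustration, not values from the paper:

```python
import numpy as np

def medium_profile(cl_int, d_exch, v_med=1.0, v_slice=0.1, fu=1.0,
                   t_end=10.0, dt=1e-3):
    """Euler simulation of a toy medium/slice model; returns medium conc. over time."""
    n = int(t_end / dt)
    c_med, c_slice = 1.0, 0.0          # initial concentrations
    out = np.empty(n)
    for i in range(n):
        flux = d_exch * (c_med - c_slice)   # medium <-> slice diffusive exchange
        elim = cl_int * fu * c_slice        # metabolic elimination in the slice
        c_med += dt * (-flux) / v_med
        c_slice += dt * (flux - elim) / v_slice
        out[i] = c_med
    return out

# Doubling a clearance that already dwarfs the diffusion coefficient (0.05)
# barely moves the medium profile; doubling a small clearance moves it a lot.
fast, faster = medium_profile(1.0, 0.05), medium_profile(2.0, 0.05)
slow, slower = medium_profile(0.01, 0.05), medium_profile(0.02, 0.05)
print(np.max(np.abs(fast - faster)), np.max(np.abs(slow - slower)))
```

In the high-clearance regime the slice acts as a near-perfect sink, so the medium profile is governed almost entirely by the diffusion rate, and clearance estimates from medium samples alone become unreliable, consistent with the a-posteriori check described above.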
90.
Determining the size and demographic characteristics of substance abuse populations is extremely important for implementing public policies aimed at the control of substance abuse. Such information not only assists in the allocation of limited treatment resources by the state, but also in the monitoring of substance abuse trends over time and in the evaluation of innovative policy initiatives. In this study, we develop three composite measures of treatment need. We then use these measures to estimate treatment need for alcohol abuse and for controlled substance abuse within each of Florida's 67 counties. This study provides an important empirical component of community planning, quantifying and, to a limited degree, specifying the level of need for the substance abuse treatment of community residents. An additional benefit is the development of a cost-effective and unobtrusive methodology for determining empirically when levels of need are changing so that treatment levels can be adjusted accordingly. With proper use, policymakers can readily employ the methodology developed in this study in Florida and elsewhere to make better-informed decisions in the allocation of finite substance abuse treatment resources.