91.
The p-value-based adjustment of individual endpoints and the global test for an overall inference are the two general approaches for the analysis of multiple endpoints. Statistical procedures developed for testing multivariate outcomes often assume that the multivariate endpoints are either independent or normally distributed. This paper presents a general approach for the analysis of multivariate binary data under the framework of generalized linear models. The generalized estimating equations (GEE) approach is applied to estimate the correlation matrix of the test statistics using the identity and exchangeable working correlation matrices with the model-based as well as robust estimators. The objectives of the approaches are the adjustment of p-values of individual endpoints, to identify the affected endpoints, as well as the global test of an overall effect. A Monte Carlo simulation was conducted to evaluate the overall familywise error (FWE) rates of the single-step-down p-value adjustment approach, under two adjustment methods, and of three global test statistics. The p-value adjustment approach seems to control the FWE better than the global approach. Applications of the proposed methods are illustrated by analyzing a carcinogenicity experiment designed to study the dose-response trend for 10 tumor sites, and a developmental toxicity experiment with three malformation types: external, visceral, and skeletal.
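As a minimal sketch of the GEE building block described above, the snippet below fits clustered binary endpoints with an exchangeable working correlation and robust (sandwich) standard errors using statsmodels. The simulated data, column names, and model formula are assumptions made for the demo; the p-value adjustment step itself is not shown.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated long-format data: several binary endpoints per subject (illustrative only).
rng = np.random.default_rng(0)
n_subjects, n_endpoints = 200, 3
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subjects), n_endpoints),
    "endpoint": np.tile(np.arange(n_endpoints), n_subjects),
    "dose": np.repeat(rng.choice([0.0, 1.0, 2.0], n_subjects), n_endpoints),
})
df["response"] = rng.binomial(1, 1 / (1 + np.exp(-(-1.0 + 0.5 * df["dose"]))))

# GEE for binary responses with an exchangeable working correlation within subjects.
model = sm.GEE.from_formula(
    "response ~ dose + C(endpoint)",
    groups="subject",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()   # the robust (sandwich) covariance estimator is the default
print(result.summary())
```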
92.
When the X̄ control chart is used to monitor a process, three parameters should be determined: the sample size, the sampling interval between successive samples, and the control limits of the chart. Duncan presented a cost model to determine the three parameters for an X̄ chart. Alexander et al. combined Duncan's cost model with the Taguchi loss function to present a loss model for determining the three parameters. In this paper, the Burr distribution is employed to conduct the economic-statistical design of X̄ charts for non-normal data. Alexander's loss model is used as the objective function, and the cumulative distribution function of the Burr distribution is applied to derive the statistical constraints of the design. An example is presented to illustrate the solution procedure. From the results of the sensitivity analyses, we find that small values of the skewness coefficient have no significant effect on the optimal design; however, a larger value of the skewness coefficient leads to a slightly larger sample size and sampling interval, as well as wider control limits. Meanwhile, an increase in the kurtosis coefficient results in an increase in the sample size and wider control limits.
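To illustrate only the statistical-constraint side of such a design, the sketch below takes an assumed Burr XII distribution for the quality characteristic and checks, by simulation, the in-control false-alarm rate of candidate control limits for the sample mean. The shape parameters, sample size, and limit width are placeholders, not values from the article, and the economic optimization is not shown.

```python
import numpy as np
from scipy import stats

# Assumed Burr XII shape parameters (placeholders, not values from the article).
c, k = 4.874, 6.158
n = 5                  # illustrative sample size
burr = stats.burr12(c, k)
mu, sigma = burr.mean(), burr.std()

# Candidate control limits for the sample mean: mu +/- L * sigma / sqrt(n).
L = 3.0
lcl = mu - L * sigma / np.sqrt(n)
ucl = mu + L * sigma / np.sqrt(n)

# Monte Carlo check of the in-control false-alarm rate, since the exact
# distribution of the mean of Burr variates has no simple closed form.
rng = np.random.default_rng(1)
means = burr.rvs(size=(100_000, n), random_state=rng).mean(axis=1)
alpha = np.mean((means < lcl) | (means > ucl))
print(f"approximate false-alarm rate: {alpha:.4f}")
```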
93.
The authors give easy-to-check sufficient conditions for the geometric ergodicity and the finiteness of the moments of a random process x_t = φ(x_{t-1},…, x_{t-p}) + ε_t σ(x_{t-1},…, x_{t-q}), in which φ: R^p → R, σ: R^q → R, and (ε_t) is a sequence of independent and identically distributed random variables. They deduce strong mixing properties for this class of nonlinear autoregressive models with changing conditional variances, which includes, among others, the ARCH(p), the AR(p)-ARCH(p), and the double-threshold autoregressive models.
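A concrete member of this model class is the AR(1)-ARCH(1) process. The sketch below simulates a path of x_t = a·x_{t-1} + ε_t·√(w + b·x_{t-1}²); the parameter values are illustrative assumptions chosen to keep the process stable, not conditions taken from the paper.

```python
import numpy as np

# Simulate x_t = phi(x_{t-1}) + eps_t * sigma(x_{t-1}) with
# phi(x) = a * x and sigma(x)^2 = w + b * x^2 (an AR(1)-ARCH(1) model).
a, w, b = 0.5, 0.2, 0.3          # illustrative, stability-friendly values
n, burn_in = 5_000, 500

rng = np.random.default_rng(42)
eps = rng.standard_normal(n + burn_in)

x = np.zeros(n + burn_in)
for t in range(1, n + burn_in):
    cond_sd = np.sqrt(w + b * x[t - 1] ** 2)   # sigma(x_{t-1})
    x[t] = a * x[t - 1] + eps[t] * cond_sd     # phi(x_{t-1}) + eps_t * sigma(x_{t-1})

x = x[burn_in:]   # discard burn-in so the sample is close to the stationary regime
print("sample mean:", x.mean(), "sample variance:", x.var())
```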
94.
Conditional variance estimation in heteroscedastic regression models
First, we propose a new method for estimating the conditional variance in heteroscedastic regression models. For heavy-tailed innovations, this method is in general more efficient than either the local linear or the local likelihood estimator. Secondly, we apply a variance reduction technique to improve the inference for the conditional variance. The proposed methods are investigated through their asymptotic distributions and numerical performance.
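For orientation, the sketch below shows a plain residual-based conditional variance estimate: smooth the data to get the mean, then smooth the squared residuals. This is a generic Nadaraya-Watson construction, not the estimator proposed in the paper, and the bandwidth and simulated functions are placeholders.

```python
import numpy as np

def nw_smooth(x_grid, x, y, h):
    """Nadaraya-Watson smoother with a Gaussian kernel (illustrative helper)."""
    w = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

# Simulated heteroscedastic regression: y = m(x) + sigma(x) * eps (assumed for the demo).
rng = np.random.default_rng(0)
n = 1_000
x = rng.uniform(-2, 2, n)
m = np.sin(x)                      # true mean function
sigma = 0.3 + 0.2 * x ** 2         # true conditional standard deviation
y = m + sigma * rng.standard_normal(n)

grid = np.linspace(-1.8, 1.8, 50)
h = 0.2                            # bandwidth (placeholder; normally data-driven)

# Step 1: estimate the conditional mean and form squared residuals.
m_hat = nw_smooth(x, x, y, h)
r2 = (y - m_hat) ** 2

# Step 2: smooth the squared residuals to estimate the conditional variance.
var_hat = nw_smooth(grid, x, r2, h)
print(np.column_stack([grid[:5], var_hat[:5], (0.3 + 0.2 * grid[:5] ** 2) ** 2]))
```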
95.
We propose optimal procedures to achieve the goal of partitioning k multivariate normal populations into two disjoint subsets with respect to a given standard vector. A multivariate normal population is defined as good or bad according to whether its Mahalanobis distance to the known standard vector is small or large. Partitioning the k multivariate normal populations is reduced to partitioning k non-central chi-square or non-central F distributions with respect to the corresponding non-centrality parameters, depending on whether the covariance matrices are known or unknown. The minimum required sample size for each population is determined to ensure that the probability of a correct decision attains a given level. An example is given to illustrate our procedures.
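The reduction to non-central chi-square distributions can be sketched numerically: with a known covariance matrix, n times the squared Mahalanobis distance of a sample mean to the standard vector follows a non-central chi-square law, so exceedance probabilities can be read from scipy. All numbers and the cut-off below are assumptions for the demo, not the paper's design constants.

```python
import numpy as np
from scipy import stats

# With known covariance, n * (xbar - mu0)' Sigma^{-1} (xbar - mu0) is non-central
# chi-square with p degrees of freedom and non-centrality n * delta^2, where delta^2
# is the population's squared Mahalanobis distance to the standard vector mu0.
p, n = 3, 40
mu0 = np.zeros(p)                      # known standard vector
mu = np.array([0.5, 0.0, -0.3])        # one population's true mean (assumed)
Sigma = np.eye(p)                      # known covariance (assumed)

delta2 = (mu - mu0) @ np.linalg.solve(Sigma, mu - mu0)   # squared Mahalanobis distance
lam = n * delta2                                         # non-centrality parameter

# Probability that the statistic exceeds a placeholder cut-off c.
c = stats.chi2.ppf(0.95, df=p)
prob_exceed = stats.ncx2.sf(c, df=p, nc=lam)
print(f"P(statistic > cut-off) = {prob_exceed:.3f} at n = {n}")
```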
96.
This article explores the problem of designing a CSP-1 plan with specified values of the average outgoing quality limit (AOQL), the acceptable quality level (AQL), and the limiting quality level (LQL). By adopting the regret-balanced criterion under the producer's and the consumer's quality interests, we can design the optimal CSP-1 plan.
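To make the quantities concrete, the sketch below evaluates the average outgoing quality (AOQ) of a CSP-1 plan with clearance number i and sampling fraction f, using Dodge's standard formulas with found defectives removed, and locates the AOQL by a grid search over the incoming fraction defective. It illustrates the plan's performance measures only; it is not the regret-balanced design procedure itself, and the plan parameters are placeholders.

```python
import numpy as np

def csp1_aoq(p, i, f):
    """Average outgoing quality of a CSP-1 plan (Dodge's formulas, defectives removed).

    p : incoming fraction defective, i : clearance number, f : sampling fraction.
    """
    q = 1.0 - p
    u = (1.0 - q ** i) / (p * q ** i)   # expected units inspected in the screening phase
    v = 1.0 / (f * p)                   # expected units passed during the sampling phase
    # Only the uninspected fraction (1 - f) of the sampling phase can pass defectives.
    return p * (1.0 - f) * v / (u + v)

# Illustrative plan parameters (assumptions, not values from the article).
i, f = 38, 0.10

p_grid = np.linspace(1e-4, 0.10, 2_000)
aoq = np.array([csp1_aoq(p, i, f) for p in p_grid])
print(f"AOQL ≈ {aoq.max():.4f} attained near p ≈ {p_grid[aoq.argmax()]:.4f}")
```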
97.
We propose a unified approach to the estimation of regression parameters under double-sampling designs, in which a primary sample consisting of data on rough or proxy measures of the response and/or explanatory variables, as well as a validation subsample consisting of data on the exact measurements, are available. We assume that the validation sample is a simple random subsample of the primary sample. Our proposal utilizes a specific parametric model to extract the partial information contained in the primary sample. The resulting estimator is consistent even if this model is misspecified, and it achieves higher asymptotic efficiency than the estimator based only on the validation data. Specific cases are discussed to illustrate the application of the proposed estimator.
98.
Land subsidence risk assessment (LSRA) is a multi-attribute decision analysis (MADA) problem and is often characterized by both quantitative and qualitative attributes with various types of uncertainty. The problem therefore needs to be modeled and analyzed using methods that can handle uncertainty. In this article, we propose an integrated assessment model based on the evidential reasoning (ER) algorithm and fuzzy set theory. The assessment model is structured as a hierarchical framework that regards land subsidence risk as a composite of two key factors: hazard and vulnerability. These factors can be described by a set of basic indicators defined by assessment grades, with attributes for transforming both numerical data and subjective judgments into a belief structure. The factor-level attributes of hazard and vulnerability are combined using the ER algorithm, which draws on a belief structure calculated by Dempster-Shafer (D-S) theory and a distributed fuzzy belief structure calculated by fuzzy set theory. The combined algorithms yield distributed assessment grade matrices. The application of the model to the Xixi-Chengnan area, China, illustrates its usefulness and validity for LSRA. The model utilizes a combination of all types of evidence, including all assessment information (quantitative or qualitative, complete or incomplete, precise or imprecise), to provide assessment grades that define risk assessment on the basis of hazard and vulnerability. The results will enable risk managers to apply different risk-prevention measures and mitigation planning based on the calculated risk states.
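The belief-combination step can be illustrated in isolation. The sketch below applies Dempster's rule of combination to two mass functions defined over a small set of assessment grades; the grade labels and mass values are invented for the demonstration and are not taken from the article, and the fuzzy extension is not shown.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (frozenset focal elements) with Dempster's rule."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb            # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Assessment grades (illustrative): Low, Medium, High land-subsidence risk.
L, M, H = frozenset("L"), frozenset("M"), frozenset("H")
theta = L | M | H                          # frame of discernment

# Two sources of evidence, e.g. a hazard indicator and a vulnerability indicator
# (mass values invented for the demo).
m_hazard = {M: 0.6, H: 0.2, theta: 0.2}
m_vuln = {M: 0.5, L: 0.3, theta: 0.2}

print(dempster_combine(m_hazard, m_vuln))
```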
99.
丁飞鹏, 陈建宝. 《统计研究》 (Statistical Research), 2019, 36(3): 113-123
This paper combines the least squares support vector machine (LSSVM) with the quadratic inference function (QIF) method to provide a new, fast estimation method for fixed-effects partially linear varying-coefficient panel models with within-individual correlation structures. Under certain regularity conditions, the asymptotic normality of the parametric estimators and the convergence rate of the nonparametric estimators are established. Monte Carlo simulations are used to examine the finite-sample performance of the estimation method, and the technique is applied to real data analysis. The method not only guarantees estimation efficiency and statistical inferential power, but also substantially speeds up computation.
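For readers unfamiliar with the first ingredient, a bare-bones LSSVM regression fit reduces to solving a single linear system. The sketch below shows that step with an RBF kernel; it is a generic LSSVM, not the combined LSSVM-QIF panel estimator of the paper, and the kernel width and regularization value are placeholders.

```python
import numpy as np

def rbf_kernel(A, B, width):
    """Gaussian RBF kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def lssvm_fit(X, y, gamma=10.0, width=0.5):
    """LSSVM regression: solve the (n+1) x (n+1) KKT linear system for (b, alpha)."""
    n = len(y)
    K = rbf_kernel(X, X, width)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]          # bias b, dual coefficients alpha

def lssvm_predict(X_new, X_train, b, alpha, width=0.5):
    return rbf_kernel(X_new, X_train, width) @ alpha + b

# Tiny demo with simulated data; gamma and width are placeholder tuning values.
rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
b, alpha = lssvm_fit(X, y)
X_test = np.linspace(-2, 2, 5).reshape(-1, 1)
print(lssvm_predict(X_test, X, b, alpha))
```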
100.
This paper focuses on the development of disaster social work in Mainland China and the intervention of social work in disaster relief. Before the Wenchuan earthquake and in the initial post-earthquake stage, disaster social work was based mainly on individual psychotherapy; from the earthquake until 2012, when post-earthquake recovery and reconstruction was completed, disaster relief began to emphasise community building and integration, while the basic framework for disaster social work was also established. Social workers have begun to explore a new mode of developmental and localised disaster social work. Drawing on practical experiences from disaster social work, this paper highlights the dilemmas confronting disaster relief in Mainland China and puts forward corresponding countermeasures and suggestions that could improve the future disaster relief system in Mainland China.