Search results: 543 articles found (showing items 481–490).
481.
Balanced Confidence Regions Based on Tukey's Depth and the Bootstrap   (Total citations: 1; self-citations: 0; citations by others: 1)
We propose and study bootstrap confidence regions for multivariate parameters based on Tukey's depth. The bootstrap is based on the normalized or Studentized statistic formed from an independent and identically distributed random sample drawn from an unknown distribution in R^q. Bootstrap points are deleted on the basis of Tukey's depth until the desired confidence level is reached. The proposed confidence regions are shown to be second-order balanced in the sense discussed by Beran. We also study the asymptotic consistency of Tukey's depth-based bootstrap confidence regions. The applicability of the proposed method is demonstrated in a simulation study.
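The depth-based trimming described in this abstract can be sketched numerically: compute an approximate Tukey (halfspace) depth for each bootstrap replicate and discard low-depth points until the desired coverage fraction remains. The direction-grid approximation and the use of the plain bootstrap mean below are illustrative assumptions, not the authors' exact Studentized construction.

```python
import numpy as np

def tukey_depth(point, cloud, n_dirs=180):
    """Approximate halfspace (Tukey) depth of a 2-D point: the minimum,
    over a grid of directions, of the fraction of cloud points lying in
    a closed halfspace through `point`."""
    angles = np.linspace(0.0, np.pi, n_dirs, endpoint=False)
    dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # (n_dirs, 2)
    proj = (cloud - point) @ dirs.T                            # (n, n_dirs)
    frac_ge = (proj >= 0).mean(axis=0)
    frac_le = (proj <= 0).mean(axis=0)
    return float(np.minimum(frac_ge, frac_le).min())

def depth_trimmed_region(sample, level=0.90, n_boot=500, seed=None):
    """Bootstrap the sample mean, then keep the `level` fraction of
    replicates with the highest depth within the bootstrap cloud."""
    rng = np.random.default_rng(seed)
    n = len(sample)
    boots = np.array([sample[rng.integers(0, n, n)].mean(axis=0)
                      for _ in range(n_boot)])
    depths = np.array([tukey_depth(b, boots) for b in boots])
    keep = depths >= np.quantile(depths, 1.0 - level)
    return boots[keep]
```

The surviving points form a depth-trimmed cloud whose convex hull serves as the confidence region.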
482.
This paper considers the problem of estimating a cumulative distribution function (cdf) when it is known a priori to dominate a known cdf. The estimator considered is obtained by adjusting the empirical cdf using the prior information. This adjusted estimator is shown to be consistent, its limiting distribution is found, and its mean squared error (MSE) is shown to be smaller than that of the empirical cdf. Its asymptotic efficiency relative to the empirical cdf is also derived.
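One natural way to "adjust the empirical cdf using the prior information" is to take the pointwise maximum of the empirical cdf and the dominated cdf G, which enforces the known dominance constraint. This is a hedged reading of the abstract, not necessarily the paper's exact adjustment:

```python
import numpy as np

def ecdf(sample):
    """Return the empirical cdf of a 1-D sample as a callable."""
    xs = np.sort(np.asarray(sample, float))
    def F(t):
        return np.searchsorted(xs, t, side="right") / len(xs)
    return F

def adjusted_ecdf(sample, G):
    """Adjusted estimator when the true cdf is known to dominate G:
    the pointwise maximum of the empirical cdf and G (an assumed form
    of the adjustment, chosen to satisfy the dominance constraint)."""
    F = ecdf(sample)
    def F_adj(t):
        return max(F(t), G(t))
    return F_adj
```

By construction the adjusted estimator never falls below either the empirical cdf or G, so it cannot violate the prior dominance information.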
483.
The present study uses an operational model as well as simple empirical relationships for estimating hazard zones due to fire, explosion, and toxic vapor cloud dispersion. The empirical relationships are based on assigning an appropriate weight to each of the parameters on which the hazard in question (fire, explosion, or toxic vapor dispersion) depends. Results from these two approaches [the operational model FLAMCALC of the U.K. Health and Safety Executive (HSE) and an empirical model named FIREX] have been compared with data from the 1984 Mexico City disaster. In general, results from the empirical approach and FLAMCALC are comparable to the observed effects.
484.
Heavy gas dispersion models have been developed at IIT (hereinafter referred to as IIT heavy gas models I and II) to estimate vulnerable zones due to accidental (instantaneous and continuous, respectively) releases of dense toxic material into the atmosphere. The results obtained from the IIT heavy gas models have been compared with those from the DEGADIS model [Dense Gas Dispersion Model, developed by Havens and Spicer (1985) for the U.S. Coast Guard] as well as with observed data collected during the Burro Series, Maplin Sands, and Thorney Island field trials. Both models include the relevant features of dense gas dispersion: gravity slumping, air entrainment, cloud heating, and transition to the passive phase. The DEGADIS model was chosen as the benchmark because it incorporates most of the physical processes of dense gas dispersion in an elaborate manner and has been satisfactorily tested against field observations. The predictions from the IIT heavy gas models follow a fairly similar trend to the observed values from the Thorney Island, Burro Series, and Maplin experiments, with a tendency toward overprediction. There is good agreement between the predictions of IIT heavy gas models I and II and those from DEGADIS, except for the simulations of IIT heavy gas model I pertaining to very large release quantities under highly stable atmospheric conditions. In summary, the performance of the IIT heavy gas models has been found to be reasonably good, both with respect to the limited field data available and in various simulations performed with DEGADIS (selected on the basis of storage quantities typical of industry and prevalent meteorological conditions). However, there is scope for improvement in the IIT heavy gas models (e.g., better formulation of entrainment, modification of coefficients, and transition criteria).
Further, isotons (nomograms) have been prepared using the IIT heavy gas models for chlorine; these provide safe distances for various storage amounts under 24 meteorological scenarios prevalent over the entire year. The nomograms are prepared so that a nonspecialist can use them easily for control and management in an emergency requiring the evacuation of people in the affected region. The results can also be useful for siting and for limiting storage quantities.
485.
A non-normal class of distribution functions (the Edgeworth series distribution) in three and four parameters is considered for the dose-binary response relationship. This class accounts for the non-normality (expressed in terms of skewness and kurtosis) present in the relationship, in addition to the usual location and scale parameters generally considered by two-parameter models. We present maximum likelihood estimation of the parameters and a test of the probit (normal distribution) hypothesis. When fitted to the data of Milicer & Szczotka (1966), the Edgeworth series distribution showed excellent agreement with the observed values, a significant improvement over the probit and logit fits (Aranda-Ordaz, 1981), and a better fit than the Prentice (1976) model.
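The four-parameter idea can be illustrated with the standard Edgeworth expansion of a cdf around the normal, where skewness and excess kurtosis enter as corrections via Hermite polynomials; the paper's exact parametrization may differ, so treat this as a sketch. With skew = kurt = 0 it collapses to the probit (normal) model.

```python
import math

def edgeworth_cdf(x, mu=0.0, sigma=1.0, skew=0.0, kurt=0.0):
    """Edgeworth-series cdf: the normal cdf corrected for skewness
    (gamma1) and excess kurtosis (gamma2) via Hermite polynomials.
    Reduces to the probit model when skew = kurt = 0."""
    z = (x - mu) / sigma
    phi = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)   # normal pdf
    Phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))            # normal cdf
    he2 = z * z - 1                                         # He2(z)
    he3 = z ** 3 - 3 * z                                    # He3(z)
    he5 = z ** 5 - 10 * z ** 3 + 15 * z                     # He5(z)
    return Phi - phi * (skew / 6 * he2 + kurt / 24 * he3
                        + skew ** 2 / 72 * he5)
```

For moderate skew/kurt values the expansion is a valid cdf over the dose range of interest; large values can make it non-monotone, which is the usual caveat with Edgeworth-type models.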
486.
Inferences for survival curves based on right-censored continuous or grouped data are studied. Testing homogeneity against an order-restricted alternative, and testing the order restriction as the null hypothesis, are both considered. Under a proportional hazards model, the ordering on the survival curves corresponds to an ordering on the regression coefficients. Approximate likelihood methods are obtained by applying order-restricted procedures to the estimates of the regression coefficients. Ordered analogues of the log rank test, based on the score statistics, are also considered. Chi-bar-squared distributions, which have been studied extensively, are shown to provide reasonable approximations to the null distributions of these test statistics. Using Monte Carlo techniques, the powers of these two types of tests are compared with those available in the literature.
487.
This paper investigates the general linear regression model Y = Xβ + e, assuming the dependent variable is observed as a scrambled response using Eichhorn & Hayre's (1983) approach to collecting sensitive personal information. The estimates of the parameters in the model remain unbiased, but the variances of the estimates increase due to scrambling. The Wald test of the null hypothesis H0: β = β0 against the alternative Ha: β ≠ β0 is also investigated. Parameter estimates obtained from scrambled responses are compared, by simulation, to those from conventional direct-question surveys. The coverage of nominal 95% confidence intervals is also reported.
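The unbiased-but-noisier behavior is easy to reproduce in a small Monte Carlo sketch of Eichhorn & Hayre-style multiplicative scrambling (report Z = S·Y with the scrambler's mean known, then de-scramble by dividing by E[S]). The specific scrambler distribution below, uniform on [0.5, 1.5], is an illustrative assumption:

```python
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares via the normal equations."""
    return np.linalg.solve(X.T @ X, X.T @ y)

def simulate(beta, n=200, n_rep=400, scramble=True, seed=0):
    """Repeatedly fit OLS on direct or scrambled responses and return
    the array of estimates: scrambling leaves the mean near beta but
    inflates the spread of the estimates."""
    rng = np.random.default_rng(seed)
    ests = []
    for _ in range(n_rep):
        X = np.column_stack([np.ones(n), rng.normal(size=n)])
        y = X @ beta + rng.normal(size=n)
        if scramble:
            S = rng.uniform(0.5, 1.5, size=n)  # scrambler, E[S] = 1 known
            y = (S * y) / 1.0                  # de-scramble by E[S]
        ests.append(fit_ols(X, y))
    return np.array(ests)
```

Because S is independent of Y, E[S·Y | X] = E[S]·Xβ, so dividing by the known E[S] restores unbiasedness while the multiplicative noise inflates the estimator variance.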
488.
Singh and Sukhatme [4] considered the problem of optimum stratification on an auxiliary variable x when the units from the different strata are selected with probability proportional to the value of the auxiliary variable and the stratum sample sizes are determined by the Neyman allocation method. The present paper considers the same problem for the proportional and equal allocation methods. Rules for finding approximately optimum strata boundaries under these two allocation methods are given. The relative efficiency of these allocation methods with respect to Neyman allocation is also investigated. The performance of equal allocation is found to be better than that of proportional allocation and practically equivalent to Neyman allocation.
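The three allocation rules compared here are standard: for strata with weights W_h and standard deviations S_h, Neyman allocation takes n_h ∝ W_h·S_h, proportional takes n_h ∝ W_h, and equal splits n evenly. A minimal sketch (continuous sample sizes, finite-population correction ignored) makes the efficiency comparison concrete:

```python
import numpy as np

def stratified_var(W, S, n_h):
    """Variance of the stratified sample mean: sum of W_h^2 S_h^2 / n_h
    (finite-population correction ignored)."""
    W, S, n_h = (np.asarray(a, float) for a in (W, S, n_h))
    return float(np.sum(W**2 * S**2 / n_h))

def allocate(W, S, n, method):
    """Per-stratum sample sizes under Neyman, proportional, or equal
    allocation (continuous sizes, for comparison only)."""
    W, S = np.asarray(W, float), np.asarray(S, float)
    if method == "neyman":
        w = W * S
    elif method == "proportional":
        w = W
    elif method == "equal":
        w = np.ones_like(W)
    else:
        raise ValueError(method)
    return n * w / w.sum()
```

Neyman allocation minimizes the variance among these rules by construction; whether equal allocation beats proportional depends on how the stratum variances S_h vary, which is what the boundary-construction rules in the paper exploit.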
489.
In this article, we propose a new class of distributions based on the concept of exponentiated generalization, with a modification that provides better flexibility. The proposed distribution accommodates various shapes of hazard rate, including the bathtub shape. The exponential distribution is taken as the baseline. Various statistical properties of the proposed distribution are studied, and the parameters of the proposed model are estimated by maximum likelihood. Finally, we analyze four real datasets to illustrate the flexibility of the model in comparison with eight existing well-known distributions.
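The exponentiation device itself is simple: raise a baseline cdf to a power α. With the exponential baseline this gives the classical exponentiated-exponential cdf F(x) = (1 − e^(−λx))^α, whose hazard is increasing for α > 1 and decreasing for α < 1. The abstract does not specify the paper's modification (which is what yields the bathtub shape), so the sketch below covers only the unmodified classical case:

```python
import math

def exp_exp_cdf(x, lam, alpha):
    """Classical exponentiated-exponential cdf: the Exp(lam) baseline
    cdf raised to the power alpha.  (The paper's modified class is not
    specified in the abstract; this is only the baseline device.)"""
    return (1.0 - math.exp(-lam * x)) ** alpha

def hazard(x, lam, alpha, eps=1e-6):
    """Numerical hazard rate f(x) / (1 - F(x)) via a central difference."""
    f = (exp_exp_cdf(x + eps, lam, alpha)
         - exp_exp_cdf(x - eps, lam, alpha)) / (2 * eps)
    return f / (1.0 - exp_exp_cdf(x, lam, alpha))
```

Setting alpha = 1 recovers the constant hazard of the plain exponential, the usual sanity check for such generalizations.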
490.
To deal with the problem of non-response, one-parameter classes of imputation techniques are suggested and their corresponding point estimators proposed. The proposed classes of estimators include several other estimators as particular cases for different values of the parameter. A design-based approach is used to compare the proposed strategy with existing strategies. The theoretical results are verified through simulation studies using real data examples.
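A one-parameter imputation class of this kind can be illustrated with a family that interpolates between two familiar special cases, mean imputation and ratio imputation, via a single exponent. This particular family is an illustrative assumption, not the paper's exact class:

```python
import numpy as np

def impute(y, x, miss, alpha):
    """One-parameter imputation family (illustrative assumption):
    impute y_i = mean(y_resp) * (x_i / mean(x_resp)) ** alpha.
    alpha = 0 gives mean imputation; alpha = 1 gives ratio imputation."""
    y = np.asarray(y, float).copy()
    x = np.asarray(x, float)
    resp = ~np.asarray(miss)
    ybar, xbar = y[resp].mean(), x[resp].mean()
    y[~resp] = ybar * (x[~resp] / xbar) ** alpha
    return y
```

Sweeping alpha traces out the whole class, which is how a design-based comparison of the resulting point estimators would be set up.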
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号