1.
Recently, we developed a GIS-integrated Integral Risk Index (IRI) to assess human health risks in areas where environmental pollutants are present. Contaminants were first ranked by applying a self-organizing map (SOM) to their persistence, bioaccumulation, and toxicity characteristics in order to obtain a Hazard Index (HI). In the present study, the original IRI was substantially improved by allowing probabilistic input data. A neuroprobabilistic HI was developed by combining the SOM with Monte Carlo analysis. In general terms, the deterministic and probabilistic HIs followed a similar pattern: polychlorinated biphenyls (PCBs) and light polycyclic aromatic hydrocarbons (PAHs) showed the highest and lowest HI values, respectively. However, the bioaccumulation score of heavy metals increased notably once a probability density function was used to describe the bioaccumulation factor. To check its applicability, a case study was investigated: the probabilistic integral risk was calculated for the chemical/petrochemical industrial area of Tarragona (Catalonia, Spain), where an environmental program has been carried out since 2002. The change in risk between 2002 and 2005 was evaluated on the basis of probabilistic data on the levels of various pollutants in soils. The results indicated that the risks of the chemicals under study did not follow a homogeneous trend. However, the current pollution levels do not constitute a relevant source of health risks for the local population. Moreover, the neuroprobabilistic HI appears to be an adequate tool to take into account in risk assessment processes.
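The probabilistic HI idea can be sketched in a few lines. This is a minimal illustration, not the authors' SOM-based method: the persistence/bioaccumulation/toxicity scores and the lognormal spread on the bioaccumulation factor are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical persistence (P), bioaccumulation (B) and toxicity (T) scores
# on a 0-1 scale for two illustrative pollutants (not the paper's data).
pollutants = {
    "PCBs":       {"P": 0.9, "B": 0.8, "T": 0.9},
    "light PAHs": {"P": 0.3, "B": 0.2, "T": 0.4},
}

def deterministic_hi(scores):
    """Point-estimate Hazard Index: simple mean of the three criteria."""
    return (scores["P"] + scores["B"] + scores["T"]) / 3.0

def probabilistic_hi(scores, n=10_000):
    """Monte Carlo HI: replace the fixed bioaccumulation score with a
    lognormal distribution centred on it (assumed spread), clipped to [0, 1]."""
    b = rng.lognormal(mean=np.log(scores["B"]), sigma=0.3, size=n)
    b = np.clip(b, 0.0, 1.0)
    return ((scores["P"] + b + scores["T"]) / 3.0).mean()

hi_det = {name: deterministic_hi(s) for name, s in pollutants.items()}
hi_mc = {name: probabilistic_hi(s) for name, s in pollutants.items()}
```

As in the abstract, the deterministic and Monte Carlo versions rank the pollutants the same way; the probabilistic version additionally carries the spread of the bioaccumulation factor into the index.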
2.
A Bayesian approach is presented for detecting influential observations using general divergence measures on the posterior distributions. A sampling-based approach using a Gibbs or Metropolis-within-Gibbs method is used to compute the posterior divergence measures. Four specific measures are proposed, which convey the effects of a single observation or covariate on the posterior. The technique is applied to a generalized linear model with binary response data, an overdispersed model, and a nonlinear model. An asymptotic approximation of the posterior divergence using Laplace's method is also briefly discussed.
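The case-deletion idea behind such divergence measures can be illustrated with a toy conjugate model, where the posterior is available in closed form and no Gibbs sampling is needed (a sketch only; the paper's models require the sampling-based computation). The divergence between the full-data posterior and each leave-one-out posterior flags influential points.

```python
import numpy as np

# Toy data: one outlying observation (index 9).
y = np.array([1.0, 1.2, 0.8, 1.1, 0.9, 1.05, 0.95, 1.15, 0.85, 5.0])

# Conjugate normal model: y_i ~ N(mu, 1), prior mu ~ N(0, 10^2).
def posterior(data, prior_mean=0.0, prior_var=100.0, obs_var=1.0):
    n = len(data)
    var = 1.0 / (1.0 / prior_var + n / obs_var)
    mean = var * (prior_mean / prior_var + data.sum() / obs_var)
    return mean, var

def kl_normal(m0, v0, m1, v1):
    """KL( N(m0, v0) || N(m1, v1) ) in closed form."""
    return 0.5 * (np.log(v1 / v0) + (v0 + (m0 - m1) ** 2) / v1 - 1.0)

m_full, v_full = posterior(y)
# Divergence between each case-deleted posterior and the full posterior:
# large values flag influential observations.
influence = np.array([
    kl_normal(*posterior(np.delete(y, i)), m_full, v_full) for i in range(len(y))
])
most_influential = int(influence.argmax())
```

Deleting the outlier shifts the posterior mean far more than deleting any other point, so its KL divergence dominates.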
3.
Many journalists and scholars will look for the correlation of that chain of spectacular transformations that changed, as if at one blow, the fates of tens of millions of individuals and the hitherto firm bipolar picture of the modern world ... Today, many people are talking and writing about the role of the intellectuals, students, and the theatre, or the influence of the Soviets' perestroika, and economic difficulties. They're right. I myself as a playwright would also add the influence of humour and honesty, and perhaps even something beyond us, something maybe even unearthly. Vaclav Havel, Preface to Gwyn Prins (editor), Spring in Winter: The 1989 Revolutions.
4.
5.
Goal programming (GP) is designed to resolve allocation problems with conflicting goals. Both goals and constraints are incorporated into the allocation decision, and the objective function is stated so that, upon solution, it yields a result "as close as possible" to the priority-weighted goals. The present paper applies GP methodology to the investment decisions of dual-purpose funds (DPFs), which are required by law to pursue allocation decisions with potentially conflicting objectives. It provides an empirical demonstration that DPF managers could have improved their investment selection and subsequent performance by using GP. Finally, the paper stresses the importance of sensitivity analysis for improving both the goal-ranking and target-selection aspects of the methodology, and provides a limited but illuminating empirical demonstration of post-optimality analysis.
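A goal program of this kind reduces to a linear program once under- and over-achievement deviation variables are attached to each goal. The sketch below is a hypothetical two-asset dual-purpose fund with an income goal and a growth goal; all yields, targets, and priority weights are invented for illustration, not taken from the paper.

```python
from scipy.optimize import linprog

# Decision vector: [x1, x2, d1m, d1p, d2m, d2p]
# x1, x2   = portfolio weights (asset 1 income-oriented, asset 2 growth-oriented)
# d?m, d?p = under- / over-achievement deviations for each goal.
w_income, w_growth = 3.0, 1.0          # assumed priority weights
c = [0, 0, w_income, 0, w_growth, 0]   # penalize under-achievement only

A_eq = [
    [0.06, 0.02, 1, -1, 0, 0],   # income goal: 6% / 2% yields, target 5%
    [0.02, 0.10, 0, 0, 1, -1],   # growth goal: 2% / 10% appreciation, target 6%
    [1.00, 1.00, 0, 0, 0, 0],    # fully invested
]
b_eq = [0.05, 0.06, 1.0]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
x1, x2 = res.x[:2]
```

With these weights the income goal is met exactly (x1 = 0.75) and the residual penalty comes entirely from the under-achieved growth goal; re-solving with different weights is exactly the kind of sensitivity analysis the paper advocates.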
6.
The main aim of this study is to investigate India's demand for international reserves, focusing on the role of national monetary disequilibrium, and to present new benchmarks for assessing the adequacy of international reserves. We assessed India's position in terms of reserve adequacy and found that India is well placed, with a sufficient stock of international reserves to meet the minimum adequacy requirements. The results also reveal that the central bank holds substantial excess reserves, and the related opportunity cost (1.5% of GDP) appears to be considerable. Further, the estimates of the reserve demand function suggest that the scale of foreign trade, uncertainty, and profitability considerations play a significant role in determining India's long-term reserve demand policies. More importantly, validating the monetary approach to the balance of payments, our results show that national monetary disequilibrium plays a crucial role in short-run reserve movements. An excess of money demand (supply) induces an inflow (outflow) of international reserves with an elasticity of 0.56, which also implies that the Reserve Bank of India acts to correct domestic money market disequilibrium rather than leaving it entirely at the mercy of reserve inflows.
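The short-run buffer-stock relation reported in the abstract can be illustrated with simulated data: reserve changes respond to excess money demand with an elasticity near 0.56, which a simple least-squares regression recovers. The series below are synthetic, not India's actual data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated short-run relation: d(log reserves) = 0.56 * excess money demand + noise.
T = 500
excess_money_demand = rng.normal(0.0, 0.05, T)   # log excess demand (synthetic)
true_elasticity = 0.56                           # value reported in the abstract
d_log_reserves = true_elasticity * excess_money_demand + rng.normal(0.0, 0.01, T)

# OLS with an intercept via least squares.
X = np.column_stack([np.ones(T), excess_money_demand])
beta, *_ = np.linalg.lstsq(X, d_log_reserves, rcond=None)
estimated_elasticity = beta[1]
```

A positive coefficient means excess money demand draws reserves in, as the monetary approach to the balance of payments predicts.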
7.
In this paper we propose a new lifetime model for multivariate survival data in the presence of surviving fractions and examine some of its properties. Its genesis lies in situations in which there are m types of unobservable competing causes, where each cause is related to the time of occurrence of an event of interest. Our model is a multivariate extension of the univariate survival cure rate model proposed by Rodrigues et al. [J. Rodrigues, V.G. Cancho, M. de Castro, and F. Louzada-Neto, On the unification of long-term survival models, Statist. Probab. Lett. 79 (2009), pp. 753–759, doi: 10.1016/j.spl.2008.10.029]. The inferential approach uses maximum likelihood tools. We perform a simulation study to verify the asymptotic properties of the maximum likelihood estimators; the study also focuses on the size and power of the likelihood ratio test. The methodology is illustrated on a real customer churn data set.
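The univariate mixture cure model that this work extends can be fitted by maximum likelihood in a few lines: a cured fraction p never experiences the event, susceptibles fail with some baseline distribution, and censored subjects contribute S(t) = p + (1 − p)S0(t) to the likelihood. The sketch below simulates and fits a univariate exponential-baseline version (the paper's multivariate model with latent competing causes is more general).

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Simulate mixture cure data: cured fraction p_cure, exponential(lam)
# failure times for susceptibles, administrative censoring at tau.
n, p_cure, lam, tau = 5000, 0.3, 1.0, 3.0
cured = rng.random(n) < p_cure
t_event = rng.exponential(1.0 / lam, n)
t_event[cured] = np.inf                     # cured subjects never fail
time = np.minimum(t_event, tau)
event = (t_event <= tau).astype(float)      # 1 = failure observed

def neg_loglik(theta):
    p, rate = theta
    s0 = np.exp(-rate * time)               # exponential baseline survival
    f0 = rate * s0                          # exponential baseline density
    # events contribute (1-p) f0(t); censored contribute p + (1-p) S0(t)
    ll = event * np.log((1 - p) * f0) + (1 - event) * np.log(p + (1 - p) * s0)
    return -ll.sum()

res = minimize(neg_loglik, x0=[0.5, 0.5],
               bounds=[(1e-4, 1 - 1e-4), (1e-4, None)])
p_hat, lam_hat = res.x
```

With 5000 subjects the estimates land close to the simulated cure fraction and rate, consistent with the asymptotics such a simulation study checks.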
8.
9.
This paper addresses frequentist and Bayesian estimation of the unknown parameters of the generalized Lindley distribution based on lower record values. We first derive exact explicit expressions for the single and product moments of lower record values, and then use these results to compute the means, variances, and covariances between two lower record values. We next obtain the maximum likelihood estimators and the associated asymptotic confidence intervals. Furthermore, we obtain Bayes estimators under the assumption of gamma priors on both the shape and scale parameters of the generalized Lindley distribution, along with the associated highest posterior density interval estimates. Bayesian estimation is studied with respect to both a symmetric (squared error) and an asymmetric (linear-exponential, LINEX) loss function. Finally, we compute Bayesian predictive estimates and predictive interval estimates for future record values. To illustrate the findings, one real data set is analyzed, and Monte Carlo simulations are performed to compare the performance of the proposed methods of estimation and prediction.
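A lower record value is an observation smaller than everything seen before it; the data such an analysis starts from are just these records extracted from the observed sequence. A minimal sketch (generic, not specific to the generalized Lindley distribution):

```python
import numpy as np

def lower_records(seq):
    """Return the lower record values of a sequence: the first observation,
    then every observation smaller than all previous ones."""
    records = []
    current_min = np.inf
    for x in seq:
        if x < current_min:
            records.append(x)
            current_min = x
    return records

# Example: lower records of a short series.
sample = [5.2, 6.1, 4.8, 4.9, 3.7, 4.0, 2.9]
recs = lower_records(sample)   # -> [5.2, 4.8, 3.7, 2.9]

# Sanity check: for n iid observations, the expected number of records
# is the harmonic number H_n, regardless of the parent distribution.
rng = np.random.default_rng(7)
n_records = np.mean([len(lower_records(rng.random(50))) for _ in range(2000)])
harmonic_50 = sum(1.0 / k for k in range(1, 51))   # about 4.5
```

Records are scarce (about H_n of them in n observations), which is why exact moment expressions and careful inference, as in the paper, matter for record data.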
10.
This paper introduces double and group acceptance sampling plans based on time-truncated lifetimes when the lifetime of an item follows the inverse log-logistic (ILL) distribution with known shape parameter. The operating characteristic function and average sample number (ASN) values of the double acceptance sampling plan are provided. The minimum number of groups and the operating characteristic function values for various quality levels are obtained for the group acceptance sampling plan. A comparative study between the single and double acceptance sampling plans is carried out in terms of sample size. One simulated example and four real-life examples are discussed to show the applicability of the proposed double and group acceptance sampling plans for ILL-distributed quality parameters.
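The operating characteristic (OC) and ASN computations behind such plans are binomial calculations once the failure probability p at the truncation time is fixed; in the paper p would come from the ILL lifetime distribution, while the generic sketch below takes p as given. The plan parameters used in the comments are illustrative, not the paper's tables.

```python
from math import comb

def binom_pmf(i, n, p):
    return comb(n, i) * p**i * (1 - p) ** (n - i)

def binom_cdf(k, n, p):
    return sum(binom_pmf(i, n, p) for i in range(k + 1))

def oc_single(n, c, p):
    """P(accept) for a single plan: accept if at most c failures in n items
    by the truncation time."""
    return binom_cdf(c, n, p)

def oc_double(n1, c1, n2, c2, p):
    """P(accept) for a double plan: accept on d1 <= c1, reject on d1 > c2,
    otherwise draw n2 more items and accept if d1 + d2 <= c2."""
    pa = binom_cdf(c1, n1, p)
    for d1 in range(c1 + 1, c2 + 1):
        pa += binom_pmf(d1, n1, p) * binom_cdf(c2 - d1, n2, p)
    return pa

def asn_double(n1, c1, n2, c2, p):
    """Average sample number: the second sample is drawn only when
    c1 < d1 <= c2 on the first sample."""
    p_second = binom_cdf(c2, n1, p) - binom_cdf(c1, n1, p)
    return n1 + n2 * p_second
```

For example, a double plan (n1 = 10, c1 = 0, n2 = 10, c2 = 1) often decides on the first 10 items, so its ASN stays well below the 20 items a comparable single plan would always inspect, which is the sample-size advantage the comparative study measures.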