31.
We analyse whether the psychological pricing seen in the private sector has a public-sector counterpart in tax policy. Analysing the main theoretical arguments for the existence of price points, and applying them to the public sector, suggests that psychological taxing reveals itself in the use of tax rates with non-0 endings. The endings of the local income tax rates set by 308 Flemish municipalities in the fiscal year 1998 suggest the presence of psychological taxing. Non-0 endings occur more frequently in municipalities where demand for public policy is more elastic (and where, therefore, the benefits to politicians from setting a tax just below a tax point are higher). Pre-tax income inequality and the level of the tax rate positively affect psychological taxing. The latter effect is reinforced in municipalities where the existing tax rate is above the average tax rate in neighbouring municipalities and below their neighbours' minimum, although its impact is limited and is offset the further below the minimum the tax is set.
32.
The residual standard deviation of a general linear model provides information about predictive accuracy that is not revealed by the multiple correlation or regression coefficients. The classic confidence interval for a residual standard deviation is hypersensitive to minor violations of the normality assumption, and its robustness does not improve with increasing sample size. An approximate confidence interval for the residual standard deviation is proposed and shown to be robust to moderate violations of the normality assumption, with robustness to extreme non-normality that improves as the sample size increases.
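For reference, the classic normal-theory interval the abstract criticises can be sketched in a few lines. This is a minimal stdlib illustration, not the paper's proposed interval; the function names, the Wilson-Hilferty chi-square quantile approximation, and the `n_params` argument (number of estimated mean parameters) are all choices made here for the sketch:

```python
import math
from statistics import NormalDist

def chi2_quantile(p, dof):
    # Wilson-Hilferty approximation to the chi-square quantile function.
    z = NormalDist().inv_cdf(p)
    return dof * (1 - 2 / (9 * dof) + z * math.sqrt(2 / (9 * dof))) ** 3

def classic_sigma_ci(residuals, n_params, alpha=0.05):
    # Classic normal-theory CI for the residual standard deviation:
    # dof * s^2 / sigma^2 ~ chi-square(dof) under exact normality.
    dof = len(residuals) - n_params
    ss = sum(r * r for r in residuals)
    lower = math.sqrt(ss / chi2_quantile(1 - alpha / 2, dof))
    upper = math.sqrt(ss / chi2_quantile(alpha / 2, dof))
    return lower, upper
```

Because the interval is built directly on the chi-square distribution of the residual sum of squares, any excess kurtosis in the errors distorts its coverage regardless of sample size, which is the hypersensitivity the abstract refers to.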
33.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainty. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from 2 to 3 billion dollars in losses late on August 12 to a peak of 50 billion dollars for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm hit the resort areas of Charlotte Harbor near Punta Gorda and then went on to Orlando in the central part of the state, with early poststorm estimates converging on a damage estimate in the 28 to 31 billion dollar range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To inform future analyses of this kind, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. Uncertainty analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss that is attributable to each of the input variables.
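The quantity described in the last sentence, the expected fractional reduction in output variance from learning one input, is Var(E[Y|Xi])/Var(Y) and can be estimated by nested Monte Carlo sampling. The sketch below is a generic stdlib illustration with independent Uniform(0,1) inputs and an arbitrary model function; it is not the proprietary wind-field model, and all names are hypothetical:

```python
import random
import statistics

def first_order_index(model, n_vars, which, n_outer=300, n_inner=300, rng=None):
    # Nested-loop Monte Carlo estimate of Var(E[Y | X_which]) / Var(Y):
    # outer loop fixes the input of interest, inner loop averages over
    # the remaining inputs, all taken as independent Uniform(0, 1).
    rng = rng or random.Random(0)
    cond_means, all_y = [], []
    for _ in range(n_outer):
        xw = rng.random()
        ys = []
        for _ in range(n_inner):
            x = [rng.random() for _ in range(n_vars)]
            x[which] = xw
            ys.append(model(x))
        cond_means.append(statistics.fmean(ys))
        all_y.extend(ys)
    return statistics.pvariance(cond_means) / statistics.pvariance(all_y)
```

For a toy model Y = 3·X0 + X1 the index for X0 is 9/10, since fixing X0 removes nine tenths of the output variance.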
34.
An Empirical Analysis of the Effect of FDI on China's Economic Growth
On the focal question of how foreign direct investment (FDI) affects China's economic growth, domestic scholars have conducted fairly thorough research, both qualitatively with multiple indicators and quantitatively with conventional econometric models based on ordinary least squares (OLS). Building on the existing literature, this paper attempts an innovation: when running the OLS regressions it accounts for the stationarity of the data, performs stationarity tests, and applies the Cochrane-Orcutt procedure to correct for possible spurious regression, so that the estimates are more robust and the conclusions more reasonable.
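The Cochrane-Orcutt correction mentioned here can be sketched for a one-regressor model with AR(1) errors. This is a minimal stdlib sketch of the general procedure, not the paper's code; variable names are illustrative and the stationarity pre-test is omitted:

```python
import statistics

def _ols(x, y):
    # Closed-form simple regression: returns (intercept, slope).
    mx, my = statistics.fmean(x), statistics.fmean(y)
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def cochrane_orcutt(x, y, n_iter=10):
    # Iterate: estimate rho from lag-1 residual autocorrelation, then
    # refit on the quasi-differenced data y_t - rho*y_{t-1}.
    a, b = _ols(x, y)
    rho = 0.0
    for _ in range(n_iter):
        e = [yi - a - b * xi for xi, yi in zip(x, y)]
        rho = (sum(e[t] * e[t - 1] for t in range(1, len(e)))
               / sum(ei ** 2 for ei in e[:-1]))
        xs = [x[t] - rho * x[t - 1] for t in range(1, len(x))]
        ys = [y[t] - rho * y[t - 1] for t in range(1, len(y))]
        a_star, b = _ols(xs, ys)
        a = a_star / (1 - rho)  # undo the quasi-differencing of the intercept
    return a, b, rho
```

On data generated with serially correlated errors the iteration recovers both the regression coefficients and the AR(1) parameter.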
35.
The key to improving the generalization ability of a neural network model is controlling its complexity. This paper explores nonparametric regression modelling with Bayesian neural networks: prior knowledge about the model parameters is incorporated, Bayesian posterior inference is carried out given the data sample and the model assumptions, and a Markov chain Monte Carlo algorithm is used to optimize the model's control parameters. This allows the complexity of different parts of the network to be controlled and yields the posterior and predictive distributions of the model parameters. Application to five noisy two-dimensional function regression problems shows that the model's complexity adapts to the complexity of the data and gives good predictions.
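The core move here, sampling a parameter posterior with MCMC rather than optimizing a point estimate, can be illustrated on a one-parameter problem. The sketch below runs random-walk Metropolis over the slope of y = w·x with unit-variance Gaussian noise and a N(0, 10²) prior on w; it is a stand-in for the paper's far richer neural-network posterior, and every name and constant is an assumption of this sketch:

```python
import math
import random

def metropolis_slope(xs, ys, n_steps=8000, step=0.1, seed=0):
    # Random-walk Metropolis targeting p(w | data) for y = w*x + N(0,1)
    # noise with a N(0, 10^2) prior on w.
    rng = random.Random(seed)

    def log_post(w):
        ll = -0.5 * sum((y - w * x) ** 2 for x, y in zip(xs, ys))
        return ll - 0.5 * (w / 10.0) ** 2  # Gaussian log-prior

    w, samples = 0.0, []
    lp = log_post(w)
    for _ in range(n_steps):
        cand = w + rng.gauss(0.0, step)
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:  # accept/reject
            w, lp = cand, lp_cand
        samples.append(w)
    return samples
```

Discarding an initial burn-in and summarizing the remaining draws gives the posterior mean and spread; the same scheme, applied jointly to weights and complexity-controlling hyperparameters, is what lets the network's effective complexity adapt to the data.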
36.
Summary. Semiparametric mixed models are useful in biometric and econometric applications, especially for longitudinal data. Maximum penalized likelihood estimators (MPLEs) have been shown to work well by Zhang and co-workers for both linear coefficients and nonparametric functions. This paper considers the role of influence diagnostics in the MPLE by extending the case deletion and subject deletion analysis of linear models to accommodate the inclusion of a nonparametric component. We focus on influence measures for the fixed effects and provide formulae that are analogous to those for simpler models and readily computable with the MPLE algorithm. We also establish an equivalence between the case or subject deletion model and a mean shift outlier model from which we derive tests for outliers. The influence diagnostics proposed are illustrated through a longitudinal hormone study on progesterone and a simulated example.
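The basic idea of case-deletion influence on fixed effects can be shown on ordinary simple regression: refit with each case left out and measure how much the coefficients move. This toy stdlib analogue omits the penalized/nonparametric machinery of the paper, and all names are illustrative:

```python
import statistics

def _fit(x, y):
    # Simple-regression fit: returns (intercept, slope).
    mx, my = statistics.fmean(x), statistics.fmean(y)
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def case_deletion_influence(x, y):
    # For each case i, refit without it and record the total shift
    # in (intercept, slope) -- a crude case-deletion diagnostic.
    a, b = _fit(x, y)
    shifts = []
    for i in range(len(x)):
        ai, bi = _fit(x[:i] + x[i + 1:], y[:i] + y[i + 1:])
        shifts.append(abs(a - ai) + abs(b - bi))
    return shifts
```

A gross outlier produces by far the largest shift, which is the behaviour the paper's mean-shift-outlier equivalence formalizes into tests.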
37.
While most of epidemiology is observational, rather than experimental, the culture of epidemiology is still derived from agricultural experiments, rather than other observational fields, such as astronomy or economics. The mismatch is made greater as focus has turned to continuous risk factors, multifactorial outcomes, and outcomes with large variation unexplainable by available risk factors. The analysis of such data is often viewed as hypothesis testing with statistical control replacing randomization. However, such approaches often test restricted forms of the hypothesis being investigated, such as the hypothesis of a linear association, when there is no prior empirical or theoretical reason to believe that if an association exists, it is linear. In combination with the large nonstochastic sources of error in such observational studies, this suggests the more flexible alternative of exploring the association. Conclusions on the possible causal nature of any discovered association will rest on the coherence and consistency of multiple studies. Nonparametric smoothing in general, and generalized additive models in particular, represent an attractive approach to such problems. This is illustrated using data examining the relationship between particulate air pollution and daily mortality in Birmingham, Alabama; between particulate air pollution, ozone, and SO2 and daily hospital admissions for respiratory illness in Philadelphia; and between ozone and particulate air pollution and coughing episodes in children in six eastern U.S. cities. The results indicate that airborne particles and ozone are associated with adverse health outcomes at very low concentrations, and that there are likely no thresholds for these relationships.
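The nonparametric smoothing advocated here lets the data determine the shape of a dose-response curve instead of assuming linearity. A minimal stdlib stand-in is a Nadaraya-Watson kernel smoother (the paper itself uses generalized additive models; this simpler smoother and its names are chosen only for illustration):

```python
import math

def gaussian_kernel_smooth(x, y, grid, bandwidth):
    # Nadaraya-Watson estimate: at each grid point, a Gaussian-weighted
    # average of the responses, with weights decaying in |x - grid point|.
    out = []
    for g in grid:
        w = [math.exp(-0.5 * ((xi - g) / bandwidth) ** 2) for xi in x]
        out.append(sum(wi * yi for wi, yi in zip(w, y)) / sum(w))
    return out
```

Plotting such a smooth of, say, daily mortality against particle concentration would reveal a threshold as a flat region at low doses; the abstract's conclusion is that no such flat region appears.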
38.
By establishing a mathematical model for multiple linear regression, obtaining the normal equations via least-squares estimation, and carrying out the associated correlation tests, related practical problems can be solved.
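The normal-equations route described above can be sketched directly: form X'X and X'y and solve the linear system. This is an illustrative stdlib sketch using plain Gaussian elimination, not a numerically robust implementation (QR or SVD is preferred in practice):

```python
def solve_normal_equations(X, y):
    # Fit y = b0 + b1*x1 + ... by forming and solving (X'X) b = X'y.
    rows = [[1.0] + [float(v) for v in r] for r in X]  # prepend intercept
    k = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for i in range(k):  # forward elimination with partial pivoting
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for c in range(i, k):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * k
    for i in reversed(range(k)):  # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, k))) / A[i][i]
    return coef
```

On data generated exactly from a plane the solver recovers the coefficients exactly, up to floating-point error.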
39.
Multivariate model validation is a complex decision-making problem involving comparison of multiple correlated quantities, based upon the available information and prior knowledge. This paper presents a Bayesian risk-based decision method for validation assessment of multivariate predictive models under uncertainty. A generalized likelihood ratio is derived as a quantitative validation metric based on Bayes' theorem and a Gaussian distribution assumption for the errors between validation data and model prediction. The multivariate model is then assessed by comparing the likelihood ratio with a Bayesian decision threshold, a function of the decision costs and the prior probability of each hypothesis. The probability density function of the likelihood ratio is constructed using the statistics of multiple response quantities and Monte Carlo simulation. The proposed methodology is implemented in the validation of a transient heat conduction model, using a multivariate data set from experiments. The Bayesian methodology provides a quantitative approach to facilitate rational decisions in multivariate model assessment under uncertainty.
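The decision rule can be illustrated in the simplest setting: iid Gaussian errors under two hypotheses ("model valid" with small error spread, "model invalid" with large spread), with the threshold from standard Bayes risk minimization assuming zero cost for correct decisions. This one-dimensional sketch and its names are assumptions made here, not the paper's multivariate construction:

```python
import math

def gaussian_likelihood_ratio(errors, sigma_valid, sigma_invalid):
    # p(errors | H0: valid) / p(errors | H1: invalid) for iid
    # zero-mean Gaussian errors under each hypothesis.
    def loglik(sigma):
        n = len(errors)
        return (-n * math.log(sigma * math.sqrt(2 * math.pi))
                - sum(e * e for e in errors) / (2 * sigma ** 2))
    return math.exp(loglik(sigma_valid) - loglik(sigma_invalid))

def bayes_threshold(p_valid, cost_reject_valid, cost_accept_invalid):
    # Bayes-risk threshold: accept H0 (model valid) when the
    # likelihood ratio exceeds this value; correct decisions cost 0.
    return (cost_accept_invalid * (1 - p_valid)) / (cost_reject_valid * p_valid)
```

With equal priors and symmetric costs the threshold is 1: small validation errors drive the ratio far above it (accept the model), large errors drive it toward 0 (reject).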
40.
Summary. A fully Bayesian analysis of directed graphs, with particular emphasis on applications in social networks, is explored. The model is capable of incorporating the effects of covariates, within and between block ties and multiple responses. Inference is straightforward by using software that is based on Markov chain Monte Carlo methods. Examples are provided which highlight the variety of data sets that can be entertained and the ease with which they can be analysed.