302.
This paper analyses whether German corporate governance is converging towards Anglo-American practices. We summarise the extant empirical evidence on the various governance mechanisms that economic theory suggests ensure efficiency, and we describe recent legal developments. We find no clear signs of convergence in form, i.e. the main distinctive features of the German system have remained largely unaltered. However, the changes that have occurred over the last decade (especially in the legal framework) suggest a certain convergence in function, i.e. some governance mechanisms have effectively incorporated aims and/or goals generally associated with the Anglo-American model.
Marc Goergen has a degree in economics from the Free University of Brussels, an MBA from Solvay Business School (Brussels) and a DPhil from the University of Oxford. He has held appointments at UMIST and the Universities of Manchester and Reading, and holds a chair in finance at the University of Sheffield. His research interests are in international corporate governance, mergers & acquisitions, dividend policy, corporate investment models, insider trading and initial public offerings. Marc has published widely in academic journals such as European Financial Management, the Journal of Corporate Finance, the Journal of Finance, the Journal of Financial Intermediation and the Journal of Law, Economics & Organization. He has also contributed chapters to numerous books and written two books (Corporate Governance and Financial Performance, published by Edward Elgar, and Dividend Policy and Corporate Governance, published by Oxford University Press). Marc is a Research Associate of the European Corporate Governance Institute.

Miguel C. Manjon is Associate Professor at the Department of Economics, Rovira i Virgili University (Spain). He has also held visiting positions at the Netherlands Bureau for Economic Policy Analysis and the Universities of Warwick (UK) and Tilburg (the Netherlands). His research interests include corporate governance and industrial organization. He has published in Applied Economics, Empirica, the European Journal of Law and Economics, the Journal of Theoretical and Institutional Economics, the International Review of Law and Economics and Small Business Economics, among others.

Luc Renneboog is Professor of Corporate Finance at Tilburg University. He graduated from the Catholic University of Leuven with degrees in management engineering (MSc) and philosophy (BA), from the University of Chicago with an MBA, and from the London Business School with a PhD in financial economics. He has held appointments at the University of Leuven and Oxford University, and visiting appointments throughout Europe. He has published in the Journal of Finance, the Journal of Financial Intermediation, the Journal of Law and Economics, and others. His research interests are corporate finance, corporate governance, dividend policy, insider trading, law and economics, and the economics of art.
303.
This paper treats the problem of comparing different evaluations of procedures which rank the variances of k normal populations. Procedures are evaluated on the basis of loss functions appropriate to a particular goal. The goal considered involves ranking the variances of k independent normal populations when the corresponding population means are unknown. The variances are ranked by selecting samples of size n from each population and using the sample variances to obtain the ranking. Our results extend those of various authors who looked at the narrower problem of evaluating the standard procedure associated with selecting the smallest of the population variances (see, e.g., P. Somerville (1975)).

Different loss functions (both parametric and non-parametric) appropriate to the particular goal under consideration are proposed. Procedures are evaluated by the performance of their risk over a particular preference zone. The sample size n, the least favorable parametric configuration, and the maximum value of the risk are three quantities studied for each procedure. When k is small these quantities, calculated by numerical simulation, show which loss functions respond better and which respond worse to increases in sample size. Loss functions are compared with one another according to the extent of this response. Theoretical results are given for the case of asymptotically large k. It is shown that in certain cases the error incurred by using these asymptotic results is small when k is only moderately large.

This work is an outgrowth of and extends that of J. Reeves and M.J. Sobel (1987) by comparing procedures on the basis of the sample size (per population) required to obtain various bounds on the associated risk functions. New methodologies are developed to evaluate complete ranking procedures in different settings.
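The following is a minimal Monte Carlo sketch of the basic setup only, not the paper's specific loss functions or preference zones: samples of size n are drawn from k normal populations with unknown means, the populations are ranked by their sample variances, and the risk of a simple non-parametric loss (the number of misranked pairs, chosen purely for illustration) is estimated by simulation. The variance configuration and all constants are assumed values.

```python
# Illustrative sketch only: rank k normal population variances by sample variances and
# estimate the risk of a simple non-parametric loss by Monte Carlo simulation.
import numpy as np

rng = np.random.default_rng(1)
k, n, reps = 4, 20, 5000                       # populations, sample size, replications (assumed)
true_vars = np.array([1.0, 1.5, 2.25, 3.4])    # assumed variance configuration
true_order = np.argsort(true_vars)             # correct ranking, smallest variance first

def misranked_pairs(est_order, true_order):
    """Non-parametric loss: number of pairs whose relative order is reversed."""
    pos_est, pos_true = np.argsort(est_order), np.argsort(true_order)
    m = len(est_order)
    return sum(1 for i in range(m) for j in range(i + 1, m)
               if (pos_est[i] - pos_est[j]) * (pos_true[i] - pos_true[j]) < 0)

losses = []
for _ in range(reps):
    # unknown means are drawn afresh each replication; they do not affect the ranking procedure
    samples = rng.normal(loc=rng.normal(size=k)[:, None],
                         scale=np.sqrt(true_vars)[:, None], size=(k, n))
    s2 = samples.var(axis=1, ddof=1)           # unbiased sample variances
    losses.append(misranked_pairs(np.argsort(s2), true_order))

print("estimated risk (expected number of misranked pairs):", np.mean(losses))
```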
304.
The Lilliefors test, developed by Lilliefors (1967), is a well-known test for univariate normality when the population parameters are unknown. The main assumption required to implement the test is that the observations are independent. This paper demonstrates the robustness of the Lilliefors test against equicorrelated observations. More specifically, we show that the null distribution of the Lilliefors test statistic is invariant under the alternative assumption that the observations are equicorrelated.
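As a hedged illustration of the invariance result (not taken from the paper), the sketch below compares the simulated null distribution of the Lilliefors statistic under i.i.d. normal sampling with the one obtained under equicorrelated normal observations, using the statsmodels implementation of the test; the sample size, correlation level and number of replications are arbitrary choices.

```python
# Simulation sketch: the Lilliefors statistic's null distribution under i.i.d. versus
# equicorrelated normal observations should look essentially the same.
import numpy as np
from statsmodels.stats.diagnostic import lilliefors  # returns (KS statistic, p-value)

rng = np.random.default_rng(0)
n, reps, rho = 30, 2000, 0.5                 # assumed sample size, replications, equicorrelation

def equicorrelated_normal(n, rho, rng):
    # X_i = sqrt(rho)*Z0 + sqrt(1-rho)*Z_i gives Var(X_i) = 1 and Corr(X_i, X_j) = rho for i != j
    z0 = rng.standard_normal()
    return np.sqrt(rho) * z0 + np.sqrt(1.0 - rho) * rng.standard_normal(n)

stat_iid = [lilliefors(rng.standard_normal(n))[0] for _ in range(reps)]
stat_eq = [lilliefors(equicorrelated_normal(n, rho, rng))[0] for _ in range(reps)]

# Compare a few quantiles of the two simulated null distributions.
for q in (0.50, 0.90, 0.95):
    print(q, round(float(np.quantile(stat_iid, q)), 3), round(float(np.quantile(stat_eq, q)), 3))
```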
305.
The use of different measures of similarity between observed vectors for the purposes of classifying or clustering them has been expanding dramatically in recent years. One result of this expansion has been the use of many new similarity measures, designed to satisfy various criteria. A noteworthy application involves estimating the relationships between genes using microarray experimental data. We consider the class of ‘correlation-type’ similarity measures. The use of these new measures of similarity suggests that the whole problem needs to be formulated in statistical terms to clarify their relative benefits. Pursuant to this need, we define, for each given observed vector, a baseline representing the ‘true’ value common to each of the component observations. These ‘true’ values are taken to be parameters. We define the ‘true correlation’ between each pair of observed vectors as the average (over the distribution of the observations for given baseline parameters) of Pearson's correlation with the sample means replaced by the corresponding baseline parameters. Estimators of this true correlation are assessed using their mean squared error (MSE). Proper Bayes estimators of this true correlation, being based on the predictive posterior distribution of the data, are both difficult to calculate and analyze and highly non-robust. By contrast, empirical Bayes estimators are: (i) close to their Bayesian counterparts; (ii) easy to analyze; and (iii) strongly robust. For these reasons, we employ empirical Bayes estimators of correlation in place of their Bayesian counterparts. We show how to construct two different kinds of simultaneous Bayes correlation estimators: the first assumes no a priori correlation between baseline parameters; the second assumes a common unknown correlation between them. Estimators of the latter type frequently have significantly smaller MSE than those of the former type, which in turn frequently have significantly smaller MSE than their Pearson estimator counterparts. To illustrate our results, we examine the problem of inferring the relationships between gene expression level vectors in the context of microarray experimental data.
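A minimal sketch of the 'baseline-centred' building block described above: Pearson's correlation formula with the sample means replaced by given baseline parameters (the function and variable names are ours, not the paper's). Averaging this quantity over the sampling distribution for fixed baselines gives the 'true correlation' whose estimators the paper compares; the empirical Bayes machinery itself is not reproduced here.

```python
# Pearson-type correlation centred at known baseline parameters instead of sample means.
import numpy as np

def baseline_correlation(x, y, mu_x, mu_y):
    """Correlation of x and y with deviations taken from baselines mu_x, mu_y."""
    dx, dy = np.asarray(x) - mu_x, np.asarray(y) - mu_y
    return float(np.sum(dx * dy) / np.sqrt(np.sum(dx**2) * np.sum(dy**2)))

# Toy usage with two gene-expression-like vectors fluctuating around known baselines.
rng = np.random.default_rng(2)
mu_x, mu_y = 5.0, 3.0
shared = rng.normal(scale=0.3, size=12)                  # shared term induces positive dependence
x = mu_x + shared + rng.normal(scale=0.2, size=12)
y = mu_y + shared + rng.normal(scale=0.2, size=12)
print(baseline_correlation(x, y, mu_x, mu_y))            # compare with np.corrcoef(x, y)[0, 1]
```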
306.
Postel-Vinay and Robin's (2002) sequential auction model is extended to allow for aggregate productivity shocks. Workers exhibit permanent differences in ability while firms are identical. Negative aggregate productivity shocks induce job destruction by driving the surplus of matches with low ability workers to negative values. Endogenous job destruction coupled with worker heterogeneity thus provides a mechanism for amplifying productivity shocks that offers an original solution to the unemployment volatility puzzle (Shimer (2005)). Moreover, positive or negative shocks may lead employers and employees to renegotiate low wages up and high wages down when agents' individual surpluses become negative. The model delivers rich business cycle dynamics of wage distributions and explains why both low wages and high wages are more procyclical than wages in the middle of the distribution.
307.
European settlement in Quebec (Canada) began in the early 17th century, with the arrival of French pioneers. After the British Conquest in 1760, immigrants from the British Isles began to settle in some parts of Quebec; many of these immigrants were Irish Catholics. Historians and genealogists have identified several names of Irish origin in the French Canadian population, and many scholars have wondered about the extent of the integration of Irish migrants and their descendants within this population. The purposes of this study are to identify and characterize the founders of Irish origin, to estimate the importance of their genetic contribution to the contemporary Quebec population, and to measure the variability of this contribution according to the founders’ period of arrival and county of origin in Ireland. Data were obtained from a set of 2,223 ascending genealogies going back as far as the early 17th century. The average genealogical depth is a little more than 9 generations, with many branches reaching 16 or 17 generations. Although Irish founders account for less than 1% of the total Quebec gene pool, the results show that nearly 21% of the genealogies contain at least one Irish founder. These founders contributed to the peopling of all regions of Quebec, but there are important variations from one region to another. A majority of the Irish founders immigrated during the 19th century, and most of them came from the counties of Southern Ireland.
308.
Various nonparametric and parametric estimators of extremal dependence have been proposed in the literature. Nonparametric methods commonly suffer from the curse of dimensionality and have mostly been implemented in extreme-value studies of up to three dimensions, whereas parametric models can tackle higher-dimensional settings. In this paper, we assess, through a vast and systematic simulation study, the performance of classical and recently proposed estimators in multivariate settings. In particular, we first investigate the performance of nonparametric methods and then compare them with classical parametric approaches under symmetric and asymmetric dependence structures within the commonly used logistic family. We also explore two different ways to make nonparametric estimators satisfy the necessary shape constraints on the dependence function, finding a general improvement in estimator performance either (i) by substituting the estimator with its greatest convex minorant, for which we develop a computational tool applicable in dimensions D ≥ 2, or (ii) by projecting the estimator onto a subspace of dependence functions satisfying those constraints, taking advantage of Bernstein–Bézier polynomials. Implementing the convex minorant method leads to better estimator performance as the dimensionality increases.
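A minimal sketch of approach (i) for the bivariate case (D = 2): a possibly non-convex grid estimate of the Pickands dependence function A(t) is replaced by its greatest convex minorant, computed as the piecewise-linear lower convex hull of the estimated points. The noisy 'estimate' below is simulated from the logistic family A(t) = ((1 − t)^r + t^r)^(1/r) purely for illustration; it is not the estimator studied in the paper.

```python
# Greatest convex minorant (lower convex hull) of a noisy Pickands-function estimate on a grid.
import numpy as np

def greatest_convex_minorant(t, a):
    """Lower convex hull of the points (t_i, a_i), evaluated back on the grid t."""
    pts = sorted(zip(t, a))
    hull = []                                    # monotone-chain lower hull
    for p in pts:
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            # drop hull[-1] if it lies on or above the chord from hull[-2] to p
            if (y2 - y1) * (p[0] - x1) >= (p[1] - y1) * (x2 - x1):
                hull.pop()
            else:
                break
        hull.append(p)
    hx, hy = zip(*hull)
    return np.interp(t, hx, hy)

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 101)
r = 2.0                                          # assumed logistic dependence parameter
a_true = ((1 - t)**r + t**r)**(1.0 / r)
# noisy estimate, clipped to the admissible band max(t, 1-t) <= A(t) <= 1
a_hat = np.clip(a_true + rng.normal(scale=0.02, size=t.size), np.maximum(t, 1 - t), 1.0)
a_gcm = greatest_convex_minorant(t, a_hat)       # convex and still within the admissible band
print("mean abs error: raw", float(np.abs(a_hat - a_true).mean()),
      "GCM", float(np.abs(a_gcm - a_true).mean()))
```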
309.
From practice to theory, we introduce a state-of-the-art stream of papers that promotes an inclusive and complementary consideration of both analytical methods and ethical values in Operations Research and Management Sciences (OR/MS). We suggest a perspective according to which the consideration of ethics in OR/MS constitutes an enrichment of our discipline as well as a contribution to a more sustainable future in general.
310.
In this article, a preliminary analysis of the loss of life caused by Hurricane Katrina in the New Orleans metropolitan area is presented. The hurricane caused more than 1,100 fatalities in the state of Louisiana. A preliminary data set giving the recovery locations and individual characteristics for 771 fatalities has been analyzed. One-third of the analyzed fatalities occurred outside the flooded areas or in hospitals and shelters in the flooded area; these fatalities were due to the adverse public health situation that developed after the floods. Two-thirds of the analyzed fatalities were most likely associated with the direct physical impacts of the flood and were mostly caused by drowning. The majority of victims were elderly: nearly 60% of fatalities were over 65 years old. As in historical flood events, mortality rates were highest in areas near severe breaches and in areas with large water depths. An empirical relationship has been derived between water depth and mortality, and this has been compared with similar mortality functions proposed on the basis of data for other flood events. The overall mortality among the exposed population for this event was approximately 1%, which is similar to findings for historical flood events. Although the presented results are preliminary, they give important insights into the determinants of loss of life and the relationship between mortality and flood characteristics.
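Purely as an illustration of how a depth-mortality relationship of the kind described above can be fitted (the data points and coefficients below are synthetic, not the paper's): mortality functions in the flood-risk literature are commonly parameterised as a lognormal CDF of water depth, F(h) = Φ((ln h − μ)/σ), whose parameters can be estimated by nonlinear least squares.

```python
# Fit a lognormal-CDF mortality function to synthetic (water depth, mortality) observations.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def mortality(depth_m, mu, sigma):
    """Mortality fraction modelled as a lognormal CDF of water depth (in metres)."""
    return norm.cdf((np.log(depth_m) - mu) / sigma)

rng = np.random.default_rng(4)
depth = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0])      # synthetic depths, metres
obs = mortality(depth, mu=np.log(8.0), sigma=1.0) + rng.normal(scale=0.002, size=depth.size)
(mu_hat, sigma_hat), _ = curve_fit(mortality, depth, obs, p0=[1.0, 1.0])
print("fitted mu, sigma:", round(float(mu_hat), 3), round(float(sigma_hat), 3))
```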