389 results found (search time: 93 ms); items 81–90 are shown below.
81.
After independence, India has on the whole welcomed foreign investment, especially since the liberalization reforms of 1991. India's political, legal, and economic institutions help to prevent large-scale political risk, but the risks of nationalization and expropriation, currency exchange restrictions, government default, and war and civil unrest still exist within certain limits. Foreign investors in India therefore need to adopt appropriate legal countermeasures to prevent and mitigate political risk, and thereby secure their investment returns and objectives.
82.
There are numerous ways of displaying Likert‐type scales, but few investigators have examined these differences systematically. In this study we compared four layouts: scales running numerically from ‘0’ to ‘10’ or from ‘10’ to ‘0’, and scales running verbally from ‘clear’ to ‘unclear’ or from ‘unclear’ to ‘clear’. Over 450 participants rated seven aspects of a structured abstract in a web‐based study, each using only one of the four scale formats listed above. The resulting data showed that the scale ‘Clear – 10 … 0 – Unclear’ consistently led to significantly higher ratings in all seven cases. Such findings have implications for the design of Likert‐type scales and for the data that are gathered from them.
83.
The purpose of this article is to strengthen the understanding of the relationship between a fixed-blocks and random-blocks analysis in models that do not include interactions between treatments and blocks. Treating the block effects as random has been recommended in the literature for balanced incomplete block designs (BIBD) because it results in smaller variances of treatment contrasts. This reduction in variance is large if the block-to-block variation relative to the total variation is small. However, this analysis is also more complicated because it results in a subjective interpretation of results if the block variance component is non-positive. The probability of a non-positive variance component is large precisely in those situations where a random-blocks analysis is useful, that is, when the block-to-block variation, relative to the total variation, is small. In contrast, the analysis in which the block effects are fixed is computationally simpler and less subjective. The loss in power with a fixed-effects analysis is trivial for some BIBDs. In such cases, we recommend treating the block effects as fixed. For response surface experiments designed in blocks, however, an opposite recommendation is made. When block effects are fixed, the variance of the estimated response surface is not uniquely estimated, and in practice this variance is obtained by ignoring the block effect. It is argued that a more reasonable approach is to treat the block effects as random than to ignore them.
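The claim that non-positive block-variance estimates are likely precisely when block-to-block variation is small can be checked with a short simulation. The sketch below uses a complete block layout rather than a BIBD for simplicity, with hypothetical variance parameters, and applies the method-of-moments estimator (MS_block − MS_error)/t; a substantial fraction of replications yields a non-positive estimate when sigma_b is small relative to sigma_e.

```python
import numpy as np

rng = np.random.default_rng(0)
t, b = 4, 6                       # treatments, blocks (complete block layout)
sigma_b, sigma_e = 0.1, 1.0       # small block-to-block variation vs. error
n_rep = 2000

neg = 0
for _ in range(n_rep):
    # response = block effect + error (treatment effects omitted; they do not
    # affect the block variance estimator)
    y = rng.normal(0, sigma_b, b)[None, :] + rng.normal(0, sigma_e, (t, b))
    grand = y.mean()
    ms_block = t * ((y.mean(axis=0) - grand) ** 2).sum() / (b - 1)
    resid = y - y.mean(axis=0) - y.mean(axis=1)[:, None] + grand
    ms_error = (resid ** 2).sum() / ((t - 1) * (b - 1))
    # method-of-moments estimate of the block variance component
    if (ms_block - ms_error) / t <= 0:
        neg += 1

frac = neg / n_rep
print(f"fraction of non-positive block-variance estimates: {frac:.2f}")
```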
84.

The quick-response production strategy is motivated by rapid technological change, which drives down market prices and hastens obsolescence. This study considers the strategy of locating the final production line in response to changes in market demand and the continuous deterioration of stock. The demand rate is assumed to decrease exponentially with time, while the price is assumed to decrease linearly with time. The purpose of this study is to derive the most economical site for the final production line that assembles products with short life cycles. The model takes into account sales revenue, deterioration cost, carrying cost, variable cost, and the fixed cost of production. Although labour and material costs are higher when the production site is located near the market, total profit increases owing to quicker response time, a smaller import tax, lower inventory, and lower deterioration cost.
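As an illustration of the trade-off described above, the sketch below evaluates a stylized profit function for two candidate sites. All parameters are hypothetical, and the per-unit stand-ins for tax, carrying, and deterioration costs are a deliberate simplification; this is not the paper's model, only the direction of its argument.

```python
import numpy as np

# Illustrative (hypothetical) parameters, not taken from the paper
D0, a = 100.0, 0.05        # demand rate D(t) = D0 * exp(-a t)
p0, c = 50.0, 0.3          # unit price p(t) = p0 - c t
T = 52.0                   # length of the product's life cycle (weeks)

def integrate(f, t):
    """Trapezoid rule on a sampled function."""
    return float(np.sum((f[:-1] + f[1:]) * np.diff(t)) / 2)

def total_profit(lead_time, unit_cost, import_tax, per_unit_holding, fixed_cost):
    """Sales start only after `lead_time`; tax, carrying and deterioration
    costs are folded into crude per-unit figures."""
    t = np.linspace(lead_time, T, 1000)
    demand = D0 * np.exp(-a * t)
    price = np.maximum(p0 - c * t, 0.0)
    revenue = integrate(price * demand, t)
    units = integrate(demand, t)
    return revenue - units * (unit_cost + import_tax + per_unit_holding) - fixed_cost

# Near-market site: higher labour/material cost but fast response, no import tax
near = total_profit(lead_time=1, unit_cost=22, import_tax=0, per_unit_holding=1, fixed_cost=2000)
# Offshore site: cheaper production but slow response, import tax, more deterioration
far = total_profit(lead_time=6, unit_cost=15, import_tax=4, per_unit_holding=3, fixed_cost=1500)
print(f"near-market profit: {near:,.0f}, offshore profit: {far:,.0f}")
```

With these numbers the longer lead time costs the offshore site the early, high-price portion of the demand curve, which outweighs its lower unit cost.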
85.
Centering on the anticipative and reactive capabilities of firms, accurate response is an important supply‐side strategy for dealing with demand uncertainty. Clearly, the structure of the possible reaction will crucially influence the optimal anticipative decision making. In this article, we extend the existing literature in this area by including a new reactive capability, namely the utilization of refurbished consumer returns from early sales to meet demand later in the selling season. Because consumer returns depend on previous sales, there is also a direct link to the anticipative supply decision. We capture this effect in a newsvendor‐type model and provide both analytical and numerical insights into the optimal anticipative and reactive decisions as well as the value of refurbishing in terms of the retailer's expected profitability.
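A stylized simulation can illustrate the mechanism: early sales generate returns, and refurbishing those returns adds reactive stock for the late season. All parameters here (Poisson demands, a 15% return rate, the cost figures) are hypothetical, and the model is a simplification of the newsvendor-type model the abstract describes.

```python
import numpy as np

price, unit_cost, refurb_cost = 10.0, 6.0, 2.0
return_rate = 0.15           # fraction of early sales that come back as returns

def expected_profit(Q, use_returns, n_sim=100_000, seed=1):
    rng = np.random.default_rng(seed)   # same seed => paired comparison
    d1 = rng.poisson(80, n_sim)         # early-season demand
    d2 = rng.poisson(60, n_sim)         # late-season demand
    early_sales = np.minimum(d1, Q)
    returns = rng.binomial(early_sales, return_rate)
    # refurbished returns can serve late demand only if the capability is used
    stock_left = Q - early_sales + (returns if use_returns else 0)
    late_sales = np.minimum(d2, stock_left)
    revenue = price * (early_sales + late_sales - returns)  # returns are refunded
    costs = unit_cost * Q + (refurb_cost * returns if use_returns else 0)
    return float((revenue - costs).mean())

with_r = expected_profit(120, use_returns=True)
without_r = expected_profit(120, use_returns=False)
print(f"profit with refurbishing:    {with_r:.1f}")
print(f"profit without refurbishing: {without_r:.1f}")
```

Fixing the seed makes the two runs use common random numbers, so the difference isolates the value of the refurbishing capability at this anticipative order quantity Q.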
86.
This paper provides a novel approach to ordering signals based on the property that more informative signals lead to greater variability of conditional expectations. We define two nested information criteria (supermodular precision and integral precision) by combining this approach with two variability orders (dispersive and convex orders). We relate precision criteria with orderings based on the value of information to a decision maker. We then use precision to study the incentives of an auctioneer to supply private information. Using integral precision, we obtain two results: (i) a more precise signal yields a more efficient allocation; (ii) the auctioneer provides less than the efficient level of information. Supermodular precision allows us to extend the previous analysis to the case in which supplying information is costly and to obtain an additional finding; (iii) there is a complementarity between information and competition, so that both the socially efficient and the auctioneer's optimal choice of precision increase with the number of bidders.
87.
We analyze the benefits of inventory pooling in a multi‐location newsvendor framework. Using a number of common demand distributions, as well as the distribution‐free approximation, we compare the centralized (pooled) system with the decentralized (non‐pooled) system. We investigate the sensitivity of the absolute and relative reduction in costs to the variability of demand and to the number of locations (facilities) being pooled. We show that for the distributions considered, the absolute benefit of risk pooling increases with variability, and the relative benefit stays fairly constant, as long as the coefficient of variation of demand stays in the low range. However, under high‐variability conditions, both measures decrease to zero as the demand variability is increased. We show, through analytical results and computational experiments, that these effects are due to the different operating regimes exhibited by the system under different levels of variability: as the variability is increased, the system switches from the normal operation to the effective and then complete shutdown regimes; the decrease in the benefits of risk pooling is associated with the latter two regimes. The centralization allows the system to remain in the normal operation regime under higher levels of variability compared to the decentralized system.
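For normally distributed demand, the pooling comparison can be made in closed form: the expected mismatch cost at the optimal order quantity is (cu + co)·σ·φ(z*) with z* = Φ⁻¹(cu/(cu + co)), so pooling n identical independent locations multiplies the standard deviation by √n instead of the cost by n, a saving factor of 1 − 1/√n. A minimal sketch with illustrative numbers:

```python
from math import sqrt
from statistics import NormalDist

def optimal_newsvendor_cost(sigma, cu, co):
    """Expected mismatch cost at the optimal order quantity for normal demand:
    (cu + co) * sigma * pdf(z*), with z* = inv_cdf(cu / (cu + co)).
    The demand mean does not enter the optimal cost."""
    nd = NormalDist()
    z = nd.inv_cdf(cu / (cu + co))
    return (cu + co) * sigma * nd.pdf(z)

sigma, cu, co, n = 20.0, 4.0, 1.0, 8     # per-location demand sd, costs, locations
decentralized = n * optimal_newsvendor_cost(sigma, cu, co)
pooled = optimal_newsvendor_cost(sqrt(n) * sigma, cu, co)   # sd of pooled demand
saving = 1 - pooled / decentralized
print(f"decentralized {decentralized:.1f}, pooled {pooled:.1f}, saving {saving:.1%}")
```

This also shows why the relative benefit is constant in the low-variability regime the abstract describes: for normal demand it depends only on n, not on the coefficient of variation.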
88.
One of the principal sources of error in data collected from structured face-to-face interviews is the interviewer. The other major component of imprecision in survey estimates is sampling variance. It is rare, however, to find studies in which the complex sampling variance and the complex interviewer variance are both computed. This paper compares the relative impact of interviewer effects and sample design effects on survey precision by making use of an interpenetrated primary sampling unit–interviewer experiment which was designed by the authors for implementation in the second wave of the British Household Panel Study as part of its scientific programme. It also illustrates the use of a multilevel (hierarchical) approach in which the interviewer and sample design effects are estimated simultaneously while being incorporated in a substantive model of interest.
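The variance inflation caused by interviewer clustering is commonly summarized by the design effect deff = 1 + (m − 1)ρ, where m is the interviewer workload and ρ the intra-interviewer correlation. The simulation below, with hypothetical values of m and ρ, checks that formula; it is only a sketch of the variance-inflation idea, not the paper's multilevel analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n_int, m = 200, 25                # interviewers and respondents per interviewer
rho = 0.02                        # intra-interviewer correlation
s_int, s_e = np.sqrt(rho), np.sqrt(1 - rho)   # total variance normalized to 1

means = []
for _ in range(4000):
    # each interviewer contributes a shared effect to all m of their respondents
    y = (np.repeat(rng.normal(0, s_int, n_int), m)
         + rng.normal(0, s_e, n_int * m))
    means.append(y.mean())

var_clustered = np.var(means)
var_srs = 1.0 / (n_int * m)       # variance of the mean under simple random sampling
deff = var_clustered / var_srs
print(f"design effect ~ {deff:.2f}  (theory: 1 + (m-1)*rho = {1 + (m - 1) * rho:.2f})")
```

Even a small ρ inflates the variance of the survey mean substantially when workloads are large, which is why interviewer variance is worth estimating alongside the sampling variance.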
89.
An exponentially weighted moving average (EWMA) control chart of squared distance is developed by means of a double EWMA approach to monitor process dispersion with individual measurements distributed within the class of elliptically symmetric distributions. Several examples highlighting possible extensions of the control chart to multivariate processes are provided. In particular, for multivariate normal processes, an investigation on the detection power of the chart is carried out through Monte Carlo studies. The results show that the proposed control chart performs well, especially when a process has a small or moderate shift.
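A minimal sketch of the double-EWMA idea, assuming a known in-control mean and identity covariance (so the squared distance of an individual observation is chi-square with p degrees of freedom); the control limit here is a crude empirical one, not the chart design studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
p, lam = 2, 0.1                     # dimension and EWMA smoothing constant

def double_ewma(d2, lam, start):
    """Apply EWMA twice to a series of squared distances (double EWMA)."""
    z1 = z2 = start
    out = np.empty(len(d2))
    for i, d in enumerate(d2):
        z1 = lam * d + (1 - lam) * z1   # first smoothing
        z2 = lam * z1 + (1 - lam) * z2  # second smoothing
        out[i] = z2
    return out

# squared Euclidean distance from the known in-control mean (0); in control,
# d2 is chi-square with p degrees of freedom, so its mean is p
x_in = rng.standard_normal((300, p))            # in-control phase
x_out = rng.standard_normal((100, p)) + 1.0     # mean shifted by 1 per coordinate
d2 = np.sum(np.concatenate([x_in, x_out]) ** 2, axis=1)

z = double_ewma(d2, lam, start=float(p))
ucl = p + 3 * np.std(z[100:300])    # crude empirical limit from the in-control run
signal = bool(np.any(z[300:] > ucl))
print("signal after shift:", signal)
```

The double smoothing makes the in-control statistic very stable around p, so even this modest shift pushes the statistic cleanly past the limit.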
90.
Adam M. Finkel, Risk Analysis, 2014, 34(10): 1785–1794
If exposed to an identical concentration of a carcinogen, every human being would face a different level of risk, determined by his or her genetic, environmental, medical, and other uniquely individual characteristics. Various lines of evidence indicate that this susceptibility variable is distributed rather broadly in the human population, with perhaps a factor of 25‐ to 50‐fold between the center of this distribution and either of its tails, but cancer risk assessment at the EPA and elsewhere has always treated every (adult) human as identically susceptible. The National Academy of Sciences “Silver Book” concluded that EPA and the other agencies should fundamentally correct their mis‐computation of carcinogenic risk in two ways: (1) adjust individual risk estimates upward to provide information about the upper tail; and (2) adjust population risk estimates upward (by about sevenfold) to correct an underestimation due to a mathematical property of the interindividual distribution of human susceptibility, in which the susceptibility averaged over the entire (right‐skewed) population exceeds the median value for the typical human. In this issue of Risk Analysis, Kenneth Bogen disputes the second adjustment and endorses the first, though he also relegates the problem of underestimated individual risks to the realm of “equity concerns” that he says should have little if any bearing on risk management policy. In this article, I show why the basis for the population risk adjustment that the NAS recommended is correct—that current population cancer risk estimates, whether they are derived from animal bioassays or from human epidemiologic studies, likely provide estimates of the median with respect to human variation, which in turn must be an underestimate of the mean. 
Even if cancer risk estimates did have large “conservative” biases embedded in them, a premise I have disputed in many previous writings, such a defect would not excuse ignoring this additional bias in the direction of underestimation. I also demonstrate that sensible, legally appropriate, and ethical risk policy must not only inform the public when the tail of the individual risk distribution extends into the “high‐risk” range, but must also alter benefit‐cost balancing to account for the need to reduce these tail risks preferentially.
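The median-versus-mean point can be illustrated with a hypothetical lognormal susceptibility distribution, chosen so that the 99th percentile sits about 30-fold above the median (inside the 25- to 50-fold range cited above); for a lognormal, the mean exceeds the median by the factor exp(σ²/2). The exact ratio depends entirely on the assumed shape, so the figure below is illustrative, not the NAS's sevenfold adjustment.

```python
from math import exp, log
from statistics import NormalDist

# Hypothetical assumption: lognormal susceptibility whose 99th percentile is
# 30-fold above the median
z99 = NormalDist().inv_cdf(0.99)          # standard normal 99th percentile
sigma = log(30) / z99                     # implied lognormal shape parameter
mean_over_median = exp(sigma ** 2 / 2)    # lognormal mean divided by its median
print(f"population-average risk is about {mean_over_median:.1f}x the median person's")
```

This is the right-skew effect the abstract describes: a risk estimate calibrated to the median person understates the population mean.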
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号