21.
In two recent papers, Lozano and Villa [Centralized resource allocation using data envelopment analysis. Journal of Productivity Analysis 2004;22:143–61. [1]] and Lozano et al. [Centralized target setting for regional recycling operations using DEA. OMEGA 2004;32:101–10. [2]] introduce the concept of "centralized" data envelopment analysis (DEA) models, which aim at optimizing the combined resource consumption of all units in an organization rather than considering the consumption of each unit separately. This is particularly relevant in situations where some variables are controlled by a central authority (e.g. Head Office) rather than by individual unit managers. In this paper we reconsider one of the centralized models proposed by the above-mentioned authors and suggest modifying it to consider adjustments of previously inefficient units only. We show how this new model formulation relates to a standard DEA model, namely as the analysis of the mean inefficient point. We also provide a procedure for generating alternative optimal solutions, enabling a decision maker to search through the alternatives in order to select the preferred one. We then extend the model to incorporate non-transferable as well as strictly non-discretionary variables and illustrate the models with an empirical example of a public service organization.
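For readers unfamiliar with DEA, a minimal sketch of the standard per-unit, input-oriented CCR envelopment model (the baseline that the centralized formulation aggregates across units) can be solved as a small linear program. The data and unit counts below are hypothetical, and this is not the authors' centralized model itself.

```python
# Standard input-oriented DEA (CCR envelopment form), solved unit by unit.
# Hypothetical data: rows of X are inputs, rows of Y are outputs, columns are units.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0],
              [1.0, 2.0, 3.0, 2.0]])
Y = np.array([[1.0, 2.0, 2.5, 3.0]])

m, n = X.shape   # m inputs, n units
s = Y.shape[0]   # s outputs

def efficiency(o):
    """Efficiency score theta for unit o (theta = 1 means efficient)."""
    c = np.r_[1.0, np.zeros(n)]                      # minimise theta
    A_in = np.hstack([-X[:, [o]], X])                # sum_j lambda_j x_ij - theta x_io <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])        # -sum_j lambda_j y_rj <= -y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

for o in range(n):
    print(f"unit {o}: theta = {efficiency(o):.3f}")
```

The centralized models discussed in the abstract replace this unit-by-unit optimization with a single program over the combined inputs and outputs of all units.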
22.
23.
This paper models learning by experience beyond the experience curve, including the possibility of "learning to learn": the pace of learning increases over time by building on what has already been learned. We compare the extended deterministic learning model with Jovanovic and Nyarko's [26] stochastic learning. The theoretical models are tested with data on the total factor productivity of a car-assembly plant in its first months of operation. We find that the deterministic "mixed learning model", in which the speed of learning equals a constant plus a learning-to-learn effect, is the one that best fits the empirical data. The mixed learning model yields a time pattern of total factor productivity growth that first increases and later decreases, in contrast to the always-decreasing growth rate of the learning curve, opening new perspectives on the study of learning by experience.
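A hedged illustration of the kind of comparison the abstract describes: fit a plain experience-curve form and an S-shaped "mixed" form (where the effective speed of learning grows with time) to productivity data and compare residual sums of squares. The functional forms and data below are hypothetical stand-ins, not the paper's actual specification.

```python
# Compare a plain learning curve with a logistic-style "mixed learning" form
# on synthetic TFP data for the first months of operation.
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(1, 25)                                     # months of operation
tfp = 0.9 / (1 + np.exp(-(t - 10) / 3))                  # synthetic S-shaped TFP
tfp += np.random.default_rng(0).normal(0, 0.01, t.size)  # measurement noise

def plain_learning(t, a, b):
    # classic experience curve: growth rate always decreasing
    return a * t ** b

def mixed_learning(t, a, k0, k1):
    # speed of learning = constant + learning-to-learn term (logistic shape)
    return a / (1 + np.exp(-(k0 + k1 * t)))

for name, f, p0 in [("plain", plain_learning, (0.5, 0.2)),
                    ("mixed", mixed_learning, (1.0, -2.0, 0.3))]:
    p, _ = curve_fit(f, t, tfp, p0=p0, maxfev=10000)
    rss = np.sum((tfp - f(t, *p)) ** 2)
    print(f"{name}: RSS = {rss:.4f}")
```

On S-shaped data of this kind the mixed form fits markedly better, mirroring the "first increasing, later decreasing" growth pattern described in the abstract.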
24.
The failure rate function commonly has a bathtub shape in practice. In this paper we discuss a regression model based on the new Weibull extended distribution developed by Xie et al. (2002), which can be used to model this type of failure rate function. Assuming censored data, we discuss parameter estimation by the maximum likelihood method and by a Bayesian approach in which a Gibbs algorithm with Metropolis steps is used to obtain the posterior summaries of interest. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and we also present some ways to perform global influence analysis. Case-deletion influence diagnostics are also developed for the joint posterior distribution based on the Kullback–Leibler divergence. In addition, for different parameter settings, sample sizes and censoring percentages, various simulations are performed and the empirical distribution of the martingale-type residual is compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the martingale-type residual in log-Weibull extended models with censored data. Finally, we analyze a real data set under a log-Weibull extended regression model, performing diagnostic analysis and model checking based on the martingale-type residual to select an appropriate model.
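A minimal sketch of the bathtub-shaped failure rate usually attributed to the Weibull extension of Xie et al. (2002), commonly written h(t) = λβ(t/α)^{β-1} exp((t/α)^β), with β < 1 giving the bathtub shape; the corresponding survival is S(t) = exp(λα(1 − exp((t/α)^β))). Parameter values below are illustrative only, not estimates from the paper.

```python
# Failure rate and survival of the Weibull extended distribution (as commonly stated).
import numpy as np

def hazard(t, lam, alpha, beta):
    return lam * beta * (t / alpha) ** (beta - 1) * np.exp((t / alpha) ** beta)

def survival(t, lam, alpha, beta):
    # S(t) = exp(lambda * alpha * (1 - exp((t/alpha)**beta)))
    return np.exp(lam * alpha * (1.0 - np.exp((t / alpha) ** beta)))

t = np.linspace(0.05, 3.0, 10)
print(np.round(hazard(t, lam=0.5, alpha=2.0, beta=0.5), 3))   # decreasing, flat, then increasing
print(np.round(survival(t, lam=0.5, alpha=2.0, beta=0.5), 3))
```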
25.
We contend that corruption must be detected as soon as possible so that corrective and preventive measures may be taken. Thus, we develop an early warning system based on a neural network approach, specifically self-organizing maps, to predict public corruption from economic and political factors. Unlike previous research, which is based on perceptions of corruption, we use data on actual cases of corruption. We apply the model to Spanish provinces in which actual cases of corruption were reported by the media or went to court between 2000 and 2012. We find that the taxation of real estate, economic growth, the increase in real estate prices, the growing number of deposit institutions and non-financial firms, and the same political party remaining in power for long periods seem to induce public corruption. Our model provides different profiles of corruption risk depending on the economic conditions of a region and on the timing of the prediction, and it offers different time frameworks for prediction, anticipating corruption up to three years before cases are detected.
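A minimal self-organizing map (SOM) sketch in NumPy illustrates the kind of model the early-warning system builds on: observations are mapped to a low-dimensional grid of prototype vectors, and regions of the grid can then be profiled by risk. The indicator names and data are hypothetical, not the paper's Spanish-province dataset.

```python
# Tiny batch-free SOM training loop with a Gaussian neighbourhood.
import numpy as np

rng = np.random.default_rng(0)
# hypothetical province-year indicators: real-estate tax, GDP growth,
# house-price growth, number of deposit institutions, years same party in power
X = rng.normal(size=(200, 5))

rows, cols, dim = 6, 6, X.shape[1]
W = rng.normal(size=(rows, cols, dim))          # codebook (prototype) vectors
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

def bmu(x):
    """Best-matching unit: grid coordinates of the closest prototype."""
    d = np.linalg.norm(W - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)

for epoch in range(50):
    lr = 0.5 * np.exp(-epoch / 25)              # decaying learning rate
    sigma = 3.0 * np.exp(-epoch / 25)           # decaying neighbourhood radius
    for x in X:
        b = np.array(bmu(x))
        dist2 = np.sum((grid - b) ** 2, axis=-1)
        h = np.exp(-dist2 / (2 * sigma ** 2))   # Gaussian neighbourhood weights
        W += lr * h[..., None] * (x - W)

print("BMU of first observation:", bmu(X[0]))
```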
26.
The main objective of this paper is to develop a full Bayesian analysis of the Birnbaum–Saunders (BS) regression model based on scale mixtures of normal (SMN) distributions with right-censored survival data. BS distributions based on SMN models provide a very general approach for analysing lifetime data, with the Student-t-BS, slash-BS and contaminated normal-BS distributions as special cases, and are a flexible alternative to the corresponding BS distribution or any other well-known compatible model, such as the log-normal distribution. A Gibbs sampling algorithm with Metropolis–Hastings steps is used to obtain the Bayesian estimates of the parameters. Moreover, some discussion of model selection for comparing the fitted models is given, and case-deletion influence diagnostics are developed for the joint posterior distribution based on the Kullback–Leibler divergence. The newly developed procedures are illustrated on a real data set previously analysed under BS regression models.
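A minimal sketch of simulating BS lifetimes via the standard normal representation T = β(αZ/2 + sqrt((αZ/2)² + 1))², together with a Student-t-BS variant obtained by replacing Z with a scale mixture of normals. Parameter values are illustrative only; this is not the paper's MCMC scheme.

```python
# Simulate Birnbaum-Saunders lifetimes from the normal representation,
# then a Student-t-BS variant via a scale mixture of normals.
import numpy as np

rng = np.random.default_rng(1)

def rbs(n, alpha, beta, z=None):
    z = rng.standard_normal(n) if z is None else z
    w = alpha * z / 2.0
    return beta * (w + np.sqrt(w ** 2 + 1.0)) ** 2

t_normal = rbs(5000, alpha=0.5, beta=2.0)          # classical (normal) BS

# Student-t-BS: Z = N(0,1) / sqrt(U/nu), U ~ chi-square(nu)
nu = 4
z_t = rng.standard_normal(5000) / np.sqrt(rng.chisquare(nu, 5000) / nu)
t_student = rbs(5000, alpha=0.5, beta=2.0, z=z_t)

print(np.median(t_normal), np.median(t_student))   # both close to beta = 2.0
```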
27.
We consider a single-product, single-level, stochastic master production scheduling (MPS) model where decisions are made under rolling planning horizons. Outcomes of interest are cost, service level, and schedule stability. The subject of this research is the MPS control system: the method used in determining the amount of stock planned for production in each time period. Typically, MPS control systems utilize a single buffer stock. Here, two MPS dual-buffer stock systems are developed and tested by simulation. We extend the data envelopment analysis (DEA) methodology to aid in the evaluation of the simulation results, where DEA serves to increase the scope of the experimental design. Results indicate that the dual-buffer control systems outperform existing policies.
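A hedged sketch of the single-buffer baseline such systems refine: under a rolling horizon, plan each period's production to cover the demand forecast plus one buffer (safety) stock. The rule and numbers here are illustrative, not the paper's control systems or experimental design.

```python
# Rolling single-buffer MPS rule simulated against stochastic demand.
import numpy as np

rng = np.random.default_rng(2)
T, buffer_stock = 52, 30                 # weeks, units of safety stock
forecast = np.full(T, 100.0)             # flat demand forecast
demand = rng.poisson(100, T).astype(float)

inventory, service_hits = 0.0, 0
for t in range(T):
    # plan enough production to end the period at the buffer-stock level
    production = max(forecast[t] + buffer_stock - inventory, 0.0)
    inventory += production - demand[t]
    service_hits += inventory >= 0       # demand fully met this period
    inventory = max(inventory, 0.0)      # lost-sales assumption

print("service level:", service_hits / T)
```

A dual-buffer variant would maintain two stock targets (for example, one governing planned production and one governing schedule changes), which is the kind of refinement the paper evaluates by simulation and DEA.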
28.
In this paper, the generalized log-gamma regression model is modified to allow for the possibility that long-term survivors are present in the data. This modification leads to a generalized log-gamma regression model with a cure rate, encompassing, as special cases, the log-exponential, log-Weibull and log-normal regression models with a cure rate typically used to model such data. The models simultaneously estimate the effects of explanatory variables on the acceleration or deceleration of the timing of a given event and on the surviving fraction, that is, the proportion of the population for which the event never occurs. The normal curvatures of local influence are derived under some usual perturbation schemes, and two martingale-type residuals are proposed to assess departures from the generalized log-gamma error assumption and to detect outlying observations. Finally, a data set from the medical area is analyzed.
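A minimal sketch of the standard mixture cure-rate structure, S_pop(t) = π + (1 − π)·S(t), where π is the cured (surviving) fraction. A Weibull stand-in is used for the susceptible-group survival; the paper works with a generalized log-gamma error instead, so this is illustrative only.

```python
# Population survival under a mixture cure-rate model: plateaus at pi.
import numpy as np

def s_pop(t, pi, shape, scale):
    s_susceptible = np.exp(-(t / scale) ** shape)   # Weibull survival (stand-in)
    return pi + (1.0 - pi) * s_susceptible

t = np.linspace(0, 20, 6)
print(np.round(s_pop(t, pi=0.3, shape=1.5, scale=5.0), 3))  # tends to pi = 0.3
```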
29.
Measuring Quality of Life in Small Areas Over Different Periods of Time
The purpose of this paper is to describe an index methodology for measuring quality of life, understood multidimensionally, across a set of very different municipalities, units of measurement, and time periods. Although certain technical problems arise when small areas (municipalities) are considered, both cross-sectional and serial comparisons are carried out. The methodology was applied in 314 municipalities of the province of Barcelona (Spain).
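A hedged, generic composite-index sketch (min-max normalisation of indicators followed by equal-weight aggregation), which addresses the basic comparability problem across units of measurement; the paper's own index methodology is not reproduced here, and the indicator names and values are hypothetical.

```python
# Composite quality-of-life index from heterogeneous indicators.
import numpy as np

# rows = municipalities, cols = indicators (e.g. income, health, education)
X = np.array([[18.2, 0.71, 0.55],
              [22.9, 0.80, 0.62],
              [15.4, 0.65, 0.48],
              [25.1, 0.77, 0.70]])

# min-max normalise each indicator to [0, 1] so different units of measurement
# become comparable across municipalities and time periods
Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

index = Z.mean(axis=1)          # equal-weight composite index
print(np.round(index, 3))
```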
30.
The effects of observing an adult emitting tacts on children's rate of uninstructed (i.e., "spontaneous") tacts were examined in three children diagnosed with autism. Each participant was exposed to two conditions in four settings each. In Condition 1, participants received 20 trials of teacher-initiated interactions in which the child was asked to tact 20 objects during 5 min. Condition 2 was identical to Condition 1 except that the teacher also tacted 20 objects, interspersed with the 20 tact trials. The number of uninstructed tacts was recorded in both conditions. Children emitted between 1.58 and 2.68 times more uninstructed tacts in Condition 2 than in Condition 1. These results indicate that teachers' emission of tacts increases the emission of uninstructed tacts in children with autism.