365 results found (search time: 0 ms)
101.
This paper describes the research, analysis and development of a model clarifying the similarities and differences in competencies and personality factors associated with effective leadership and management in the Royal Navy. A questionnaire study was conducted on a sample of 261 Officers and Ratings (Sailors). Their performance was rated through the organization's rigorous appraisal process, whilst competency and personality data were gathered through the Occupational Personality Questionnaire and the Leadership Dimensions Questionnaire. The results identify the common and unique relevance of specific competencies and personality factors, and so provide an illuminating insight into the differences between the constructs of leadership and management. The critical factors related to effective leadership and management performance are also identified.
We study inference in structural models with a jump in the conditional density, where location and size of the jump are described by regression curves. Two prominent examples are auction models, where the bid density jumps from zero to a positive value at the lowest cost, and equilibrium job‐search models, where the wage density jumps from one positive level to another at the reservation wage. General inference in such models remained a long‐standing, unresolved problem, primarily due to nonregularities and computational difficulties caused by discontinuous likelihood functions. This paper develops likelihood‐based estimation and inference methods for these models, focusing on optimal (Bayes) and maximum likelihood procedures. We derive convergence rates and distribution theory, and develop Bayes and Wald inference. We show that Bayes estimators and confidence intervals are attractive both theoretically and computationally, and that Bayes confidence intervals, based on posterior quantiles, provide a valid large sample inference method.
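As a toy illustration of the quantile-based Bayes intervals the abstract highlights (not the paper's auction or job-search models — the Gaussian posterior draws below are purely hypothetical, and the function names are mine), a credible interval can be read directly off the empirical quantiles of posterior samples:

```python
import random
import statistics

def quantile(sorted_draws, p):
    """Empirical p-quantile of sorted posterior draws (nearest-rank rule)."""
    idx = min(len(sorted_draws) - 1, int(p * len(sorted_draws)))
    return sorted_draws[idx]

def bayes_point_and_interval(draws, level=0.95):
    """Posterior median as point estimate, plus a quantile-based interval."""
    s = sorted(draws)
    alpha = (1 - level) / 2
    return statistics.median(s), (quantile(s, alpha), quantile(s, 1 - alpha))

# Hypothetical posterior draws, as if produced by an MCMC run
random.seed(0)
draws = [random.gauss(2.0, 0.5) for _ in range(10000)]
est, (lo, hi) = bayes_point_and_interval(draws)
```

The appeal noted in the abstract is computational: once posterior draws are available, both the estimator and a valid large-sample interval are simple order statistics.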
104.
This paper addresses the performance of scheduling algorithms for a two-stage no-wait hybrid flowshop environment with inter-stage flexibility, where there exist several parallel machines at each stage. Each job, composed of two operations, must be processed from start to completion without any interruption either on or between the two stages. For each job, the total processing time of its two operations is fixed, and the stage-1 operation is divided into two sub-parts: an obligatory part and an optional part (which is to be determined by a solution), with a constraint that no optional part of a job can be processed in parallel with an idleness of any stage-2 machine. The objective is to minimize the makespan. We prove that even for the special case with only one machine at each stage, this problem is strongly NP-hard. For the case with one machine at stage 1 and m machines at stage 2, we propose two polynomial time approximation algorithms with worst-case ratios of \(3-\frac{2}{m+1}\) and \(2-\frac{1}{m+1}\), respectively. For the case with m machines at stage 1 and one machine at stage 2, we propose a polynomial time approximation algorithm with worst-case ratio of 2. We also prove that all the worst-case ratios are tight.
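The stated guarantees for the one-machine-at-stage-1 case can be evaluated exactly; the small sketch below only tabulates the published bounds (the function name is mine, and this does not implement the scheduling algorithms themselves):

```python
from fractions import Fraction

def worst_case_ratios(m):
    """Published worst-case ratios for one stage-1 machine and m stage-2
    machines: 3 - 2/(m+1) and 2 - 1/(m+1), kept as exact fractions."""
    return 3 - Fraction(2, m + 1), 2 - Fraction(1, m + 1)

for m in (1, 2, 10):
    r1, r2 = worst_case_ratios(m)
    print(m, float(r1), float(r2))
```

Note that both bounds worsen monotonically in m, approaching 3 and 2 as the number of stage-2 machines grows.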
105.
Barrier coverage, as one of the most important applications of wireless sensor networks (WSNs), is to provide coverage for the boundary of a target region. We study the barrier coverage problem by using a set of n sensors with adjustable coverage radii deployed along a line interval or circle. Our goal is to determine a range assignment \(\mathbf {R}=({r_{1}},{r_{2}}, \ldots , {r_{n}})\) of sensors such that the line interval or circle is fully covered and its total cost \(C(\mathbf {R})=\sum _{i=1}^n {r_{i}}^\alpha \) is minimized. For the line interval case, we formulate the barrier coverage problem of line-based offsets deployment, and present two approximation algorithms to solve it. One is an approximation algorithm of ratio 4/3 that runs in \(O(n^{2})\) time, while the other is a fully polynomial time approximation scheme (FPTAS) of computational complexity \(O(\frac{n^{2}}{\epsilon })\). For the circle case, we optimally solve it when \(\alpha = 1\) and present a \(2(\frac{\pi }{2})^\alpha \)-approximation algorithm when \(\alpha > 1\). In addition, we propose an integer linear programming (ILP) formulation to minimize the total cost of the barrier coverage problem such that each point of the line interval is covered by at least k sensors.
106.
107.
This symposium describes collaborative research on neuroergonomics, technology, and cognition being conducted at George Mason University and the US Air Force Research Laboratory (AFRL) as part of the Center of Excellence in Neuroergonomics, Technology, and Cognition (CENTEC). Six presentations describe the latest developments in neuroergonomics research conducted by CENTEC scientists. The individual papers cover studies of: (1) adaptive learning systems; (2) neurobehavioral synchronicity during team performance; (3) genetics and individual differences in decision making; (4) vigilance and mindlessness; (5) interruptions and multi-tasking; and (6) development of a simulation capability that integrates measures across these domains and levels of analysis.
108.
Work in organizations requires a minimum level of consensus on the understanding of the practices performed. When adopting technological devices to support activities in complex work environments, characterized by interdependence among a large number of variables, understanding how work is done becomes both more important and more difficult. This study therefore presents a method for modeling work in complex systems, one that improves knowledge about how activities are performed in settings where work does not simply consist of executing procedures. By uniting techniques of Cognitive Task Analysis with the concept of Work Process, the method aims to provide a detailed and accurate view of how people perform their tasks, so that information systems can be applied to support work in organizations.
109.
The main objective of this work is to propose a method and a tool to support the development of indicators that inform an organization about the state of its resilience, through a cyclical process of identifying resilience factors, proposing resilience indicators, assessing organizational resilience, and then assessing and improving the indicators themselves. The research uses concepts from complex adaptive systems and from resilience engineering to establish an initial set of indicators for assessing elements that contribute to organizational resilience, and provisionally structures them as a hierarchy. A software application supporting indicator definition and structuring, questionnaire generation, and result assessment was built to speed up the experiment-adjust cycle. Prototype indicators were instantiated with helicopter operating companies in mind and were reviewed by a domain expert.
110.
The main objective of this paper is to develop a full Bayesian analysis for the Birnbaum–Saunders (BS) regression model based on scale mixtures of the normal (SMN) distribution with right-censored survival data. The BS distributions based on SMN models are a very general approach for analysing lifetime data, which has as special cases the Student-t-BS, slash-BS and the contaminated normal-BS distributions, being a flexible alternative to the use of the corresponding BS distribution or any other well-known compatible model, such as the log-normal distribution. A Gibbs sampling algorithm with Metropolis–Hastings steps is used to obtain the Bayesian estimates of the parameters. Moreover, some discussions on model selection to compare the fitted models are given, and case-deletion influence diagnostics are developed for the joint posterior distribution based on the Kullback–Leibler divergence. The newly developed procedures are illustrated on a real data set previously analysed under BS regression models.
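To illustrate the Metropolis–Hastings updates used inside such a Gibbs scheme (a generic random-walk sketch on a standard-normal toy target — not the BS–SMN posterior from the paper, and all names here are mine):

```python
import math
import random

def mh_step(x, log_post, proposal_sd):
    """One random-walk Metropolis-Hastings update of a scalar parameter."""
    x_new = x + random.gauss(0.0, proposal_sd)
    log_accept = log_post(x_new) - log_post(x)
    if log_accept >= 0 or random.random() < math.exp(log_accept):
        return x_new
    return x

def run_chain(log_post, x0, n_iter, proposal_sd=0.5):
    """Draw from log_post by repeated MH updates; a full Gibbs sampler
    would cycle such updates over blocks of parameters."""
    chain, x = [], x0
    for _ in range(n_iter):
        x = mh_step(x, log_post, proposal_sd)
        chain.append(x)
    return chain

random.seed(1)
target = lambda x: -0.5 * x * x   # standard-normal log-density, up to a constant
draws = run_chain(target, 0.0, 20000)
```

In the paper's setting each conditional of the BS–SMN posterior would play the role of `log_post`; the sketch only shows the accept/reject mechanics.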