991.
Recently, the concept of black swans has gained increased attention in the fields of risk assessment and risk management. Different types of black swans have been suggested, distinguishing between unknown unknowns (nothing in the past convincingly points to their occurrence), unknown knowns (known to some, but not to the relevant analysts), and known knowns whose probability of occurrence is judged to be negligible. Traditional risk assessments have been questioned because their standard probabilistic methods may not be capable of predicting, or even identifying, these rare and extreme events, thus creating a source of possible black swans. In this article, we show how a simulation model can be used to identify previously unknown, potentially extreme events that, if not identified and treated, could occur as black swans. We show that by manipulating a verified and validated model used to predict the impacts of hazards on a system of interest, we can identify hazard conditions not previously experienced that could lead to impacts much larger than any previously observed level. This makes these potential black swan events known and allows risk managers to consider them more fully. We demonstrate the method using a model developed to evaluate the effect of hurricanes on energy systems in the United States; we identify storms with potentially extreme impacts, well beyond what the historic record suggests is possible.
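The core idea described here, sweeping a validated impact model across hazard conditions outside the historical record and flagging any combination whose impact exceeds everything previously observed, can be illustrated with a minimal sketch. The model function, parameter ranges, and threshold below are hypothetical placeholders, not the hurricane-impact model used in the article.

```python
import itertools

# Hypothetical stand-in for a verified and validated hazard-impact model.
# In the article's setting this would be the validated hurricane model;
# here it is only an illustrative placeholder function.
def simulated_impact(wind_speed_ms, landfall_angle_deg, forward_speed_ms):
    """Return a scalar impact (e.g., customer-hours of outage) for one scenario."""
    return (wind_speed_ms ** 2) * (1.0 + 0.01 * landfall_angle_deg) / max(forward_speed_ms, 1.0)

# Largest impact observed in the historical record (placeholder value).
HISTORICAL_MAX_IMPACT = 2.0e4

# Sweep hazard conditions, including ranges not previously experienced.
wind_speeds = range(30, 101, 10)       # m/s
landfall_angles = range(0, 91, 15)     # degrees
forward_speeds = range(2, 15, 3)       # m/s

potential_black_swans = []
for w, a, f in itertools.product(wind_speeds, landfall_angles, forward_speeds):
    impact = simulated_impact(w, a, f)
    if impact > HISTORICAL_MAX_IMPACT:
        potential_black_swans.append({"wind": w, "angle": a, "forward": f, "impact": impact})

# Sort so the most extreme previously-unknown scenarios are inspected first.
potential_black_swans.sort(key=lambda s: s["impact"], reverse=True)
for scenario in potential_black_swans[:5]:
    print(scenario)
```

Once such scenarios are enumerated and ranked, they are no longer unknown and can be fed into the normal risk-treatment process.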
992.
Job insecurity has received growing attention from researchers because it poses serious challenges for organisations and for society as a whole. However, there are insufficient studies about the processes through which job insecurity affects outcomes as well as potential ways to reduce its negative impact. This study focuses on the relationship between job insecurity and individual-level outcomes (in-role performance and organisational deviance) and examines if (a) job insecurity is positively and/or negatively related to work outcomes, (b) psychological contract breach acts as a mediator of the relationship between job insecurity and work outcomes, and (c) positive psychological capital (PsyCap) buffers the job insecurity–work outcomes relationship via psychological contract breach. With a sample of 362 employee–supervisor dyads, in which the outcome measures were collected from the supervisors, we found support for our hypotheses. Specifically, we found a moderated mediation effect, whereby PsyCap moderates the negative indirect relationship of job insecurity on outcomes through psychological contract breach.  相似文献   
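The moderated mediation structure described here (job insecurity → psychological contract breach → outcomes, with PsyCap moderating the first path) can be sketched with two regression equations. The sketch below uses synthetic stand-in data and generic variable names; it is an illustrative first-stage moderated mediation setup, not the authors' exact analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data (362 dyads, matching the study's sample size); purely illustrative.
rng = np.random.default_rng(1)
n = 362
insecurity = rng.standard_normal(n)
psycap = rng.standard_normal(n)
breach = 0.5 * insecurity - 0.3 * insecurity * psycap + rng.standard_normal(n)
performance = -0.4 * breach + rng.standard_normal(n)
df = pd.DataFrame({"insecurity": insecurity, "psycap": psycap,
                   "breach": breach, "performance": performance})

# First-stage moderated mediation: PsyCap moderates the insecurity -> breach path.
mediator_model = smf.ols("breach ~ insecurity * psycap", data=df).fit()
# Outcome model: breach -> performance, controlling for the direct effect of insecurity.
outcome_model = smf.ols("performance ~ breach + insecurity", data=df).fit()

a1 = mediator_model.params["insecurity"]
a3 = mediator_model.params["insecurity:psycap"]
b = outcome_model.params["breach"]
# Conditional indirect effect of insecurity on performance at low/average/high PsyCap.
for level in (-1.0, 0.0, 1.0):
    print(f"PsyCap = {level:+.0f} SD: indirect effect = {(a1 + a3 * level) * b:.3f}")
```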
993.
Since the Last Planner System® (LPS) was devised in the early 1990s, a number of studies have pointed out the need to understand the underlying theory on which it is based. The Language-Action Perspective (LAP) has been suggested as a suitable approach for understanding the management of commitments in the LPS. This paper aims to assess the contribution of LAP to understanding construction planning and control systems based on the LPS. Two case studies were carried out in different construction companies, both highly experienced in the use of the LPS. The results reveal the role of LAP in creating explicit representations of commitment flows, which can be used to explain the sources of complexity and failures in planning systems, as well as to describe the profile of planning and control meetings.
994.
995.
In make-to-order production, schedule reliability is very important but is still not sufficiently achieved in industrial practice by the vast majority of companies. It has long been known that processing the orders at a workstation in the order of their operation due dates can, within certain boundaries, compensate for lateness in arrival at the workstation. The paper analyses the effectiveness of earliest operation due date (EODD) sequencing by comparing it to an optimistic theoretical boundary. The surprising result is that EODD can nearly fully exploit the theoretical potential. It should therefore be used in practice whenever schedule reliability is important, with only a few exceptions. Its effectiveness, however, increases with the workstation's WIP (work-in-process) level and is thus in conflict with the objective of reducing WIP levels and throughput times. A simple forecasting model makes it possible to assess the extent to which lateness can be compensated by EODD sequencing and which schedule compliance can be achieved.
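The EODD rule itself is easy to state in code. The sketch below is a minimal single-workstation simulation with hypothetical orders (names, arrival times, processing times, and operation due dates are invented for illustration); it shows how resequencing by operation due date can absorb input lateness, but it is not the paper's forecasting model.

```python
import heapq

def simulate_eodd(orders):
    """Single-workstation simulation: among the orders that have arrived, always process
    the one with the earliest operation due date (EODD). Returns lateness per order."""
    orders = sorted(orders, key=lambda o: o["arrival"])
    ready = []                     # min-heap keyed by operation due date
    time, i, lateness = 0.0, 0, {}
    while i < len(orders) or ready:
        if not ready:              # workstation idle: jump to the next arrival
            time = max(time, orders[i]["arrival"])
        while i < len(orders) and orders[i]["arrival"] <= time:
            heapq.heappush(ready, (orders[i]["due_date"], i, orders[i]))
            i += 1
        due_date, _, job = heapq.heappop(ready)
        time += job["processing_time"]
        lateness[job["name"]] = time - due_date
    return lateness

# Hypothetical orders arriving late relative to plan; EODD resequencing at the
# workstation partially compensates for this input lateness.
orders = [
    {"name": "A", "arrival": 0.0, "processing_time": 4.0, "due_date": 10.0},
    {"name": "B", "arrival": 1.0, "processing_time": 3.0, "due_date": 5.0},
    {"name": "C", "arrival": 2.0, "processing_time": 2.0, "due_date": 7.0},
]
print(simulate_eodd(orders))
```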
996.
The Kidney Exchange Problem (KEP) is a combinatorial optimization problem that has attracted attention from the integer programming/combinatorial optimisation community in the past few years. Defined on a directed graph, the KEP has two variations: one concerns cycles only, and the other concerns cycles as well as chains on the same graph. We call the former the Cardinality Constrained Multi-cycle Problem (CCMcP) and the latter the Cardinality Constrained Cycles and Chains Problem (CCCCP). The cardinality of cycles is restricted in both the CCMcP and the CCCCP. As for chains, some studies in the literature consider cardinality restrictions, whereas others do not. The CCMcP can be viewed as an Asymmetric Travelling Salesman Problem in which subtours are allowed but constrained in cardinality, and in which it is not necessary to visit all vertices. In the existing KEP literature, the cardinality bound for cycles is usually small (to the best of our knowledge, no more than six). In a CCCCP, each vertex of the directed graph can be included in at most one cycle or chain, but not both. The CCMcP and the CCCCP are interesting and challenging combinatorial optimization problems in their own right, particularly because of their similarity to problems in the travelling salesman and vehicle routing families. In this paper, our main focus is to review the existing mathematical programming models and solution methods in the literature, analyse the performance of these models, and identify future research directions. Further, we propose a polynomial-sized and an exponential-sized mixed-integer linear programming model, discuss a number of stronger constraints for cardinality-infeasible-cycle elimination for the latter, and present some preliminary numerical results.
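For orientation, the classic cycle ("exponential-sized") formulation for the cycles-only variant can be sketched as follows: enumerate all cycles within the cardinality bound, then select vertex-disjoint cycles maximising the number of matched vertices. This is a generic textbook-style formulation, not the specific models proposed in the paper; it assumes the networkx and PuLP packages and a small hypothetical compatibility graph.

```python
import networkx as nx
import pulp

def solve_kep_cycle_formulation(arcs, max_cycle_len=3):
    """Cycles-only KEP: choose vertex-disjoint directed cycles of bounded length
    so that the number of matched vertices is maximised."""
    G = nx.DiGraph(arcs)
    # Enumerate simple directed cycles and keep those within the cardinality bound.
    cycles = [c for c in nx.simple_cycles(G) if len(c) <= max_cycle_len]

    prob = pulp.LpProblem("kep_cycle_formulation", pulp.LpMaximize)
    x = [pulp.LpVariable(f"x_{i}", cat="Binary") for i in range(len(cycles))]

    # Objective: maximise the number of vertices covered by selected cycles.
    prob += pulp.lpSum(len(c) * x[i] for i, c in enumerate(cycles))

    # Each vertex may appear in at most one selected cycle.
    for v in G.nodes:
        prob += pulp.lpSum(x[i] for i, c in enumerate(cycles) if v in c) <= 1

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return [c for i, c in enumerate(cycles) if x[i].value() > 0.5]

# Hypothetical compatibility graph: arc (u, v) means the donor of pair u can give to the patient of pair v.
print(solve_kep_cycle_formulation(
    [(1, 2), (2, 1), (2, 3), (3, 4), (4, 2), (4, 5), (5, 4)], max_cycle_len=3))
```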
997.
In several areas, such as global optimization using branch-and-bound methods for mixture design, the unit n-simplex is refined by longest edge bisection (LEB). This process generates a binary search tree. For \(n>2\), simplices appearing during the refinement process can have more than one longest edge (LE). The size of the resulting binary tree depends on the specific sequence of bisected longest edges. The questions are how to calculate the size of one of the smallest binary trees generated by LEB and how to find the corresponding sequence of LEs to bisect, which can be represented by a set of LE indices. Algorithms answering these questions are presented here. We focus on sets of LE indices that are repeated at a level of the binary tree. A set of LEs was presented in Aparicio et al. (Informatica 26(1):17–32, 2015) for \(n=3\). An additional question is whether this set is the best one under the so-called \(m_k\)-valid condition.
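A minimal sketch of the LEB step itself is shown below: a simplex is stored as its vertex list, the longest edges are found (ties are possible for \(n>2\)), and bisecting a chosen longest edge at its midpoint yields the two child simplices of the binary tree. The vertices of the unit n-simplex are taken as the standard basis vectors; the paper's algorithms for tree size and LE-index selection are not reproduced here.

```python
import itertools
import numpy as np

def longest_edges(simplex, tol=1e-12):
    """Return index pairs of all edges whose length equals the maximum (ties possible for n > 2)."""
    pairs = list(itertools.combinations(range(len(simplex)), 2))
    lengths = [np.linalg.norm(simplex[i] - simplex[j]) for i, j in pairs]
    longest = max(lengths)
    return [p for p, length in zip(pairs, lengths) if length >= longest - tol]

def bisect(simplex, edge):
    """Bisect a simplex by replacing each endpoint of the chosen edge, in turn, with the edge midpoint."""
    i, j = edge
    mid = 0.5 * (simplex[i] + simplex[j])
    child_a = simplex.copy(); child_a[i] = mid
    child_b = simplex.copy(); child_b[j] = mid
    return child_a, child_b

# Unit 3-simplex (n = 3): vertices are the standard basis vectors of R^4.
unit_simplex = np.eye(4)
edges = longest_edges(unit_simplex)
print("longest edges:", edges)               # all six edges tie in the initial unit simplex
left, right = bisect(unit_simplex, edges[0]) # which LE is bisected determines the tree that unfolds
```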
998.
A neighbourly set of a graph is a subset of edges any two of which either share an end point or are joined by an edge of the graph. The maximum cardinality neighbourly set problem is known to be NP-complete for general graphs. Mahdian (Discret Appl Math 118:239–248, 2002) proved that it is solvable in polynomial time for quadrilateral-free graphs and proposed an \(O(n^{11})\) algorithm for it, where n is the number of vertices in the graph (along with a note that, by a straightforward but lengthy argument, it can be shown to be solvable in \(O(n^5)\) time). In this paper we propose an \(O(n^2)\) time algorithm for finding a maximum cardinality neighbourly set in a quadrilateral-free graph.
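The definition above translates directly into a small verification helper: check every pair of edges in the candidate set for a shared endpoint or a joining edge. This is only a checker grounded in the stated definition; the paper's \(O(n^2)\) construction algorithm is not reproduced here.

```python
from itertools import combinations

def is_neighbourly(edge_subset, graph_edges):
    """Return True if every pair of edges in edge_subset either shares an endpoint
    or is joined by an edge of the graph (the definition of a neighbourly set)."""
    edge_set = {frozenset(e) for e in graph_edges}
    for (a, b), (c, d) in combinations(edge_subset, 2):
        shares_endpoint = len({a, b} & {c, d}) > 0
        joined = any(frozenset((u, v)) in edge_set
                     for u in (a, b) for v in (c, d) if u != v)
        if not (shares_endpoint or joined):
            return False
    return True

# Example: on the path a-b-c-d, edges (a,b) and (c,d) are joined by edge (b,c).
path_edges = [("a", "b"), ("b", "c"), ("c", "d")]
print(is_neighbourly([("a", "b"), ("c", "d")], path_edges))                 # True
print(is_neighbourly([("a", "b"), ("c", "d")], [("a", "b"), ("c", "d")]))   # False: no joining edge
```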
999.
The linear sum assignment problem is a fundamental combinatorial optimisation problem and can be broadly stated as follows: given an \(n \times m\) (\(m \ge n\)) benefit matrix \(B = (b_{ij})\), match each row to a different column so that the sum of entries at the row–column intersections is maximised. This paper describes the application of a new fast heuristic algorithm, Asymmetric Greedy Search, to the asymmetric version (\(n \ne m\)) of the linear sum assignment problem. Extensive computational experiments using a range of model graphs demonstrate the effectiveness of the algorithm. The heuristic was also incorporated into an algorithm for the non-sequential protein structure matching problem, in which a non-sequential alignment between two proteins, normally with different numbers of amino acids, needs to be maximised.
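As a point of reference for the problem statement, the sketch below pairs a plain greedy baseline for the rectangular max-benefit assignment with an exact solution from SciPy. The greedy routine is a generic baseline for illustration, not the Asymmetric Greedy Search heuristic proposed in the paper, and the benefit matrix is randomly generated.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def greedy_assignment(benefit):
    """Plain greedy baseline for the rectangular max-benefit assignment (n rows, m >= n columns):
    repeatedly take the largest remaining entry whose row and column are both still unused."""
    n, m = benefit.shape
    flat_order = np.argsort(benefit, axis=None)[::-1]      # entry indices, largest benefit first
    used_rows, used_cols, pairs, total = set(), set(), [], 0.0
    for flat in flat_order:
        i, j = divmod(int(flat), m)
        if i not in used_rows and j not in used_cols:
            used_rows.add(i); used_cols.add(j)
            pairs.append((i, j)); total += benefit[i, j]
            if len(pairs) == n:                            # every row matched
                break
    return pairs, total

rng = np.random.default_rng(0)
B = rng.random((5, 8))                                     # hypothetical 5 x 8 benefit matrix
_, greedy_value = greedy_assignment(B)
rows, cols = linear_sum_assignment(B, maximize=True)       # exact optimum for comparison
print(f"greedy: {greedy_value:.3f}  optimal: {B[rows, cols].sum():.3f}")
```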
1000.
MapReduce is a popular big-data processing framework, and its performance is closely related to the efficiency of the centralized scheduler. In practice, the centralized scheduler often has little information in advance, which means that each job may become known only after it is released. Hence, in this paper we consider the online MapReduce scheduling problem of minimizing the makespan, where jobs are released over time. Both the preemptive and the non-preemptive version of the problem are considered. In addition, we assume that reduce tasks cannot be parallelized, because they are often complex and hard to decompose. For the non-preemptive version, we prove that the lower bound is \(\frac{m+m(\Psi (m)-\Psi (k))}{k+m(\Psi (m)-\Psi (k))}\), higher than for the basic online machine scheduling problem, where k is the root of the equation \(k=\big \lfloor {\frac{m-k}{1+\Psi (m)-\Psi (k)}+1 }\big \rfloor \) and m is the number of machines. We then devise a \((2-\frac{1}{m})\)-competitive online algorithm called MF-LPT (Map First-Longest Processing Time), based on LPT. For the preemptive version, we present a 1-competitive algorithm for two machines.
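The map-first LPT idea can be illustrated with a simplified, offline sketch: map tasks are list-scheduled first in longest-processing-time order, then the unsplittable reduce tasks, each waiting for its own job's map phase to finish. Release times and the online aspect are ignored here, so this is only an illustration of the LPT core and the map-before-reduce ordering, not the paper's exact MF-LPT algorithm; the instance data are hypothetical.

```python
import heapq

def mf_lpt_makespan(jobs, m):
    """Simplified, offline illustration of the Map First-LPT idea.
    jobs: list of (map_task_times, reduce_task_time); a job's reduce task is unsplittable
    and may start only after all of that job's map tasks have finished."""
    machines = [0.0] * m                      # machine completion times (min-heap)
    heapq.heapify(machines)

    # Phase 1: schedule every map task, longest first, on the least-loaded machine.
    map_tasks = sorted(((t, j) for j, job in enumerate(jobs) for t in job[0]), reverse=True)
    map_done = [0.0] * len(jobs)              # completion time of each job's map phase
    for t, j in map_tasks:
        start = heapq.heappop(machines)
        finish = start + t
        heapq.heappush(machines, finish)
        map_done[j] = max(map_done[j], finish)

    # Phase 2: schedule reduce tasks, longest first; each waits for its own map phase.
    reduce_tasks = sorted(((job[1], j) for j, job in enumerate(jobs)), reverse=True)
    for t, j in reduce_tasks:
        free = heapq.heappop(machines)
        heapq.heappush(machines, max(free, map_done[j]) + t)
    return max(machines)

# Hypothetical instance: 3 jobs on 2 machines, each with a few map tasks and one reduce task.
print(mf_lpt_makespan([([3.0, 2.0], 4.0), ([5.0], 2.0), ([1.0, 1.0, 1.0], 3.0)], m=2))
```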