911.
Biclustering is the simultaneous partitioning of a set of samples and the set of their attributes (features) into subsets (classes). Samples and features classified together are assumed to be highly relevant to one another, which can be observed in the intensity of their expressions. We define a notion of consistency for biclustering using the interrelation between the centroids of sample and feature classes, and we prove that consistent biclustering implies separability of the classes by convex cones. Whereas previous work on biclustering concentrated on unsupervised learning and did not employ a training set with a given classification, we propose a model for supervised biclustering whose consistency is achieved through feature selection. The model requires solving a fractional 0–1 programming problem. Preliminary computational results on microarray data mining problems are reported. This research was partially supported by NSF, NIH, and Air Force grants.
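
A minimal sketch of the consistency condition the abstract describes, assuming a data matrix with known sample-class and feature-class assignments; the function name and the exact form of the centroid comparison are illustrative readings of the abstract, not the authors' implementation.

```python
import numpy as np

def is_consistent(A, sample_cls, feature_cls):
    """Check a biclustering consistency condition of the kind described
    above: every sample is expressed most strongly, on average, in the
    features of its own class, and symmetrically for features.
    (An illustrative reading of the abstract, not the paper's exact
    definition.)"""
    classes = np.unique(sample_cls)
    # average expression of each sample over each feature class
    samp_means = np.stack(
        [A[:, feature_cls == k].mean(axis=1) for k in classes], axis=1)
    # average expression of each feature over each sample class
    feat_means = np.stack(
        [A[sample_cls == k, :].mean(axis=0) for k in classes], axis=1)
    samples_ok = np.all(
        np.argmax(samp_means, axis=1) == np.searchsorted(classes, sample_cls))
    features_ok = np.all(
        np.argmax(feat_means, axis=1) == np.searchsorted(classes, feature_cls))
    return samples_ok and features_ok

# toy example: two sample classes paired with two feature classes
A = np.array([[5., 6., 1.],
              [6., 5., 2.],
              [1., 2., 7.]])
print(is_consistent(A, np.array([0, 0, 1]), np.array([0, 0, 1])))  # True
```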
912.
A New Look at the Psychometric Paradigm of Perception of Hazards
The psychometric paradigm has been the most influential model in the field of risk analysis. The "cognitive maps" of hazards produced by the paradigm seem to explain how laypeople perceive the various risks they face. Because most studies have used aggregated data analyzed with principal component analysis, it is not known whether the model neglects individual differences in risk perception, and the paradigm has been much criticized because few studies have examined individual differences in the cognitive representation of hazards. To detect and describe the internal structure of the three-way data, we conducted a three-way component analysis (3MPCA). Data for the present analysis were derived from a mail survey conducted in Switzerland in which participants judged 26 hazards on 9 attributes. Individual differences in the cognitive representation of hazards were correlated with external variables (e.g., general trust). The results suggest that methods permitting individual differences should be used more frequently and that utilizing different methods could provide greater insight into the cognitive representation of risks.
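
A rough sketch of the kind of three-mode decomposition the abstract refers to, here as a truncated higher-order SVD of a respondents x hazards x attributes array. The 3MPCA used in the paper is a Tucker3 model fit by alternating least squares, so this is only a simplified stand-in, and the array sizes below are hypothetical.

```python
import numpy as np

def unfold(T, mode):
    """Matricize tensor T along the given mode."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated higher-order SVD: one loading matrix per mode plus a
    core array. A simplified stand-in for the Tucker3/3MPCA model."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, U in enumerate(factors):
        # contract each mode of the data with its loading matrix
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

# hypothetical data: 300 respondents x 26 hazards x 9 attributes
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 26, 9))
core, (P, H, A) = hosvd(X, ranks=(2, 3, 2))
print(core.shape, P.shape, H.shape, A.shape)  # (2, 3, 2) (300, 2) (26, 3) (9, 2)
```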
913.
We consider firms that feature their products on the Internet but take orders offline. Click and order data are disjoint on such non-transactional websites, and matching them is error-prone; yet their separation in time may allow the firm to react and improve its tactical planning. We introduce a dynamic decision support model that augments the classic inventory planning model with additional clickstream state variables. Using a novel data set of matched online clickstream and offline purchasing data, we identify statistically significant clickstream variables and empirically investigate the value of clickstream tracking on non-transactional websites for improving inventory management. We show that the noisy clickstream data is a statistically significant predictor of the propensity, amount, and timing of offline orders. A counterfactual analysis shows that using the demand information extracted from the clickstream can reduce inventory holding and backordering costs by 3% to 5% in our data set.
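
A toy illustration of the idea, assuming a base-stock policy whose demand forecast is conditioned on a clickstream state variable (recent product-page clicks). The regression form, the synthetic data, and the service level are all invented for the sketch, not taken from the paper.

```python
import numpy as np
from scipy import stats

# hypothetical history: weekly product-page clicks and the offline
# orders that follow them
rng = np.random.default_rng(1)
clicks = rng.poisson(200, size=104).astype(float)
orders = 5 + 0.04 * clicks + rng.normal(0, 3, size=104)

# fit a simple click->order regression (a stand-in for the paper's
# clickstream demand model)
slope, intercept, *_ = stats.linregress(clicks, orders)
resid_sd = np.std(orders - (intercept + slope * clicks), ddof=2)

def base_stock_level(current_clicks, service_level=0.95):
    """Order-up-to level = clickstream-conditional mean demand plus a
    normal safety buffer for the chosen service level."""
    mu = intercept + slope * current_clicks
    return mu + stats.norm.ppf(service_level) * resid_sd

print(round(base_stock_level(250.0), 1))  # target stock after a click surge
```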
914.
This paper presents a theoretical model to help managers visualise and manage confidential situations more effectively. The model metaphorically likens a confidential setting to the properties of a soap bubble, e.g. elastic expansion or contraction, minimal surface area to contain a given volume, fragility, surface tension, pressure, stress, strain, and the potential for bursting and thereby releasing the contents to the external environment. We develop the concept in two phases: first, we examine how a bubble and a confidential scenario form and grow; second, we consider how a bubble may burst and map these forces to the ways in which confidentiality may be breached. Many attributes are mapped, the key ones being the value embedded within the system, the criticality of maintaining confidentiality, increasing pressure, the corresponding stress/strain dynamics, and the levels of trust between stakeholders. Key research propositions are derived from the model, which aims to minimise the risk of a confidentiality breach.
915.
The university is a logical locus for discussion of the role race has played in our society. Perhaps no American institution is more committed to free and open dialogue than the university. Higher education can thus provide a context for the recognition of issues as well as a forum for the resolution of initiatives. To date, however, university attempts at diversity training have often imbued recipients with self-consciousness, usurping the unity implicit in the word 'university' and evoking an even greater tendency toward separatism. The university's traditional quest for truth has been subverted by a subtle and pervasive sense that some views are more correct than others, that openness is dangerous, and that some issues might even be taboo. At best, such an approach to diversity leads to a fragile stalemate among self-contained enclaves. By championing President Clinton's call for a dialogue on race, the university can restore itself as an institution that puts honesty above all else. Not only is there a resonant rationale for the university's central responsibility in this debate, but there is also a pedagogical means by which its role can be realized. This paper proposes a model for ethnic dialogue relevant to either a text-based or issues-based class. Borrowing from pedagogy developed in professional schools, we believe that the tenets of the 'case method' can create a climate conducive to the substantive scrutiny of race, ethnicity, and prejudice in general. We argue that this dialogue should not be a mere add-on to college life, but integrated into existing curricula in the social sciences, literature, and history. Heated debate can then occur without anger, and race/ethnicity can be discussed without fear of recrimination. Paradoxically, the very expression of ethnicity may be the catalyst that eventually moves multiculturalism toward interculturalism, where differences are no longer articulated, measurable, or even discernible.
916.
A risk assessment was conducted to determine the likelihood of certain health risks resulting from exposure to soils and food crops contaminated with polychlorinated biphenyls (PCBs). PCBs have contaminated soils, river sediments, and air as a result of past industrial activities at a capacitor plant in the city of Serpukhov, Russian Federation. This risk assessment and its suggestions for remediation are designed to aid decision-making by a joint Russian-American research team developing a community, national, and international response to industrial contamination. Bobovnikova et al. (The Science of the Total Environment 139/140, 357-364, 1993) reported elevated PCB levels in soils and sediments, breast milk, and locally grown foods in the Serpukhov area. Data from these and other investigators have been used in this risk assessment to calculate the potential cancer risk resulting from exposure to PCBs. Our assessment indicates that members of the local population may be at increased risk of cancer, and possibly other adverse health effects, as a result of PCB contamination of their environment. Because previously unassessed environmental contamination is a common problem in the former Soviet republics, as well as in many other areas of the world, we believe this type of evaluation, using known methods, can serve as a model for assessment efforts in other parts of the globe and lead to remediation in regions constrained by faltering economies.
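
For readers unfamiliar with the "known methods" the abstract invokes, the standard incremental lifetime cancer risk calculation multiplies a chronic daily intake by a cancer slope factor. The sketch below uses the conventional EPA soil-ingestion equation with hypothetical exposure values, not the study's actual Serpukhov data.

```python
# Incremental lifetime cancer risk from incidental soil ingestion,
# following the standard EPA formulation: risk = CDI * CSF.
# All numeric inputs are hypothetical illustrations, not study data.

def soil_ingestion_risk(
    conc_mg_per_kg,        # PCB concentration in soil (mg/kg)
    csf=2.0,               # PCB cancer slope factor, (mg/kg-day)^-1 (EPA upper-bound value)
    ir_mg_per_day=100.0,   # incidental soil ingestion rate (mg/day)
    ef_days_per_year=350.0,  # exposure frequency
    ed_years=30.0,           # exposure duration
    bw_kg=70.0,              # body weight
    at_days=70.0 * 365.0,    # averaging time: lifetime
):
    # chronic daily intake in mg/kg-day (1e-6 converts mg soil to kg soil)
    cdi = (conc_mg_per_kg * 1e-6 * ir_mg_per_day
           * ef_days_per_year * ed_years) / (bw_kg * at_days)
    return cdi * csf

# hypothetical soil concentration of 50 mg/kg PCBs
print(f"{soil_ingestion_risk(50.0):.1e}")  # excess lifetime cancer risk ~6e-5
```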
917.
Genetic differences (polymorphisms) among members of a population are thought to influence susceptibility to various environmental exposures. In practice, however, this information is rarely incorporated into quantitative risk assessment and risk management. We describe an analytic framework for predicting the risk reduction and value of information (VOI) resulting from specific risk management applications of genetic biomarkers, and we apply the framework to the example of occupational chronic beryllium disease (CBD), an immune-mediated pulmonary granulomatous disease. One described Human Leukocyte Antigen gene variant, HLA-DPB1*0201, contains a substitution of glutamate for lysine at position 69 and appears to have high sensitivity (approximately 94%) but low specificity (approximately 70%) with respect to CBD among individuals occupationally exposed to respirable beryllium. The expected post-intervention CBD prevalence rates for using the genetic variant (1) as a required job placement screen, (2) as a medical screen substituting semiannual for annual lymphocyte proliferation testing, or (3) as a voluntary job placement screen are 0.08%, 0.8%, and 0.6%, respectively, in a hypothetical cohort with 1% baseline CBD prevalence. VOI analysis is used to examine the reduction in total social cost, calculated as the net value of disease reduction and financial expenditures, expected for proposed CBD intervention programs based on the genetic susceptibility test. For the example cohort, the expected net VOI per beryllium worker for genetically based testing and intervention is $13,000, $1,800, and $5,100, respectively, based on a health valuation of $1.45 million per CBD case avoided. VOI results for alternative CBD evaluations are also presented. Despite large parameter uncertainty, probabilistic analysis predicts generally positive utility for each of the three evaluated programs when avoidance of a CBD case is valued at $1 million or higher. Although the utility of a proposed risk management program may be evaluated solely in terms of risk reduction and financial costs, decisions about genetic testing and program implementation must also weigh serious social, legal, and ethical factors.
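
The 0.08% figure for the required job-placement screen can be approximately reproduced with Bayes' rule from the quoted sensitivity, specificity, and baseline prevalence. The sketch below uses the rounded values given in the abstract, so a small discrepancy from the paper's exact number is expected.

```python
def prevalence_among_test_negatives(prev, sens, spec):
    """P(disease | negative genetic test): the residual CBD prevalence
    if only test-negative workers are placed in beryllium jobs."""
    false_neg = (1.0 - sens) * prev   # diseased but test-negative
    true_neg = spec * (1.0 - prev)    # healthy and test-negative
    return false_neg / (false_neg + true_neg)

# rounded values from the abstract: 1% baseline prevalence,
# ~94% sensitivity, ~70% specificity of the HLA-DPB1 Glu69 marker
p = prevalence_among_test_negatives(prev=0.01, sens=0.94, spec=0.70)
print(f"{p:.2%}")  # ~0.09%, close to the 0.08% quoted above
```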
918.
Driven by differing statutory mandates and the programmatic separation of regulatory responsibilities among federal, state, and tribal agencies, distinct chemical and radiation risk management strategies have evolved. In the field this separation poses real challenges, since many of the major environmental risk management decisions we face today require evaluating both types of risk. Over the last decade, federal, state, and tribal agencies have continued to discuss their different approaches and explore areas where their activities could be harmonized. The current framework for managing public exposure to chemical carcinogens has been referred to as a "bottom-up" approach: a risk between 10^(-4) and 10^(-6) is established as an upper-bound goal. In contrast, a "top-down" approach, which sets an upper-bound dose limit coupled with the site-specific As Low As Reasonably Achievable (ALARA) principle, is in place to manage individual exposure to radiation. While radiation risks are typically managed on a cumulative basis, exposure to chemicals is generally managed chemical by chemical and medium by medium. There are also differences in the nature and size of the sites where chemical and radiation contamination is found, and such differences result in divergent management concerns. In spite of these differences, there are several common practical concerns among radiation and chemical risk managers, including (1) the cost of site redevelopment and long-term stewardship, (2) public acceptance and involvement, and (3) the need for a flexible risk management framework to address the first two issues. This article attempts to synthesize the key differences, the opportunities for harmonization, and the challenges ahead.
919.
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls, to supply chain risks with inventory controls, and to insurance solvency risks with capital controls. However, uncertainty about the distribution of the risk factor implies that parameter error will be present, and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failure. For a large class of loss distributions arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be calculated exactly, as they are independent of the true (but unknown) parameters; hence an explicit measure of the effect of parameter uncertainty on failure probability can be obtained. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, or (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We further show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications.
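
A Monte Carlo sketch of the effect described above, assuming log-normal losses: a threshold set from estimated parameters to achieve a nominal 1% failure probability is, once estimation error is averaged over repeated samples, exceeded more often than 1%. The sample size and distribution are illustrative choices, not the paper's.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
mu, sigma = 0.0, 1.0       # true parameters, unknown to the decisionmaker
n, nominal_p = 30, 0.01    # sample size and required failure probability

exceed = []
for _ in range(20000):
    x = rng.lognormal(mu, sigma, size=n)
    # estimate location/scale on the log scale, then set the threshold
    # at the estimated (1 - nominal_p) quantile
    m_hat, s_hat = np.log(x).mean(), np.log(x).std(ddof=1)
    threshold = np.exp(m_hat + s_hat * stats.norm.ppf(1 - nominal_p))
    # true probability that a new loss exceeds this estimated threshold
    exceed.append(1 - stats.norm.cdf((np.log(threshold) - mu) / sigma))

# expected failure frequency under parameter uncertainty
print(f"nominal {nominal_p:.3f}, actual {np.mean(exceed):.3f}")  # actual > nominal
```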
920.