1.
The last observation carried forward (LOCF) approach is commonly utilized to handle missing values in the primary analysis of clinical trials. However, recent evidence suggests that likelihood-based analyses developed under the missing at random (MAR) framework are sensible alternatives. The objective of this study was to assess the Type I error rates from a likelihood-based MAR approach – mixed-model repeated measures (MMRM) – compared with LOCF when estimating treatment contrasts for mean change from baseline to endpoint (Δ). Data emulating neuropsychiatric clinical trials were simulated in a 4 × 4 factorial arrangement of scenarios, using four patterns of mean changes over time and four strategies for deleting data to generate subject dropout via an MAR mechanism. In data with no dropout, estimates of Δ and SEΔ from MMRM and LOCF were identical. In data with dropout, the Type I error rates (averaged across all scenarios) for MMRM and LOCF were 5.49% and 16.76%, respectively. In 11 of the 16 scenarios, the Type I error rate from MMRM was at least 1.00% closer to the expected rate of 5.00% than the corresponding rate from LOCF. In no scenario did LOCF yield a Type I error rate that was at least 1.00% closer to the expected rate than the corresponding rate from MMRM. The average estimate of SEΔ from MMRM was greater in data with dropout than in complete data, whereas the average estimate of SEΔ from LOCF was smaller in data with dropout than in complete data, suggesting that standard errors from MMRM better reflected the uncertainty in the data. The results from this investigation support those from previous studies, which found that MMRM provided reasonable control of Type I error even in the presence of MNAR missingness. No universally best approach to analysis of longitudinal data exists. However, likelihood-based MAR approaches have been shown to perform well in a variety of situations and are a sensible alternative to the LOCF approach. MNAR methods can be used within a sensitivity analysis framework to test the potential presence and impact of MNAR data, thereby assessing robustness of results from an MAR method. Copyright © 2004 John Wiley & Sons, Ltd.
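As a rough illustration of the comparison described above (not the authors' simulation code), the sketch below simulates one two-arm longitudinal trial under the null hypothesis, imposes MAR dropout, and contrasts an LOCF endpoint analysis with a likelihood-based mixed-model analysis. The random-intercept model fitted via statsmodels is a simplification of a full MMRM with unstructured covariance, and all parameter values are assumptions.

```python
# Hypothetical sketch (not the authors' code): one simulated trial under the
# null hypothesis, MAR dropout, then LOCF vs. a likelihood-based mixed model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(0)
n_per_arm, n_visits = 100, 4

rows = []
for arm in (0, 1):
    for i in range(n_per_arm):
        subj = arm * n_per_arm + i
        b = rng.normal(0.0, 1.0)                  # subject-level random intercept
        for visit in range(1, n_visits + 1):
            # Null case: the same mean change trajectory in both arms
            change = 0.5 * visit + b + rng.normal(0.0, 1.0)
            rows.append((subj, arm, visit, change))
data = pd.DataFrame(rows, columns=["subject", "arm", "visit", "change"])

def apply_mar_dropout(df, intercept=-2.0, slope=0.4):
    """Drop all visits after the first observed value that triggers dropout (MAR)."""
    kept = []
    for _, g in df.groupby("subject"):
        for _, row in g.sort_values("visit").iterrows():
            kept.append(row)
            p_drop = 1.0 / (1.0 + np.exp(-(intercept + slope * row["change"])))
            if rng.random() < p_drop:
                break
    return pd.DataFrame(kept)

observed = apply_mar_dropout(data)

# LOCF endpoint analysis: carry each subject's last observed value forward
locf = observed.sort_values("visit").groupby("subject").tail(1)
t_stat, p_locf = stats.ttest_ind(locf.loc[locf["arm"] == 1, "change"],
                                 locf.loc[locf["arm"] == 0, "change"])

# Likelihood-based analysis of the observed data (random intercept as a
# simplified stand-in for an MMRM with unstructured covariance)
fit = smf.mixedlm("change ~ C(arm) * C(visit)", observed,
                  groups=observed["subject"]).fit()
print("LOCF endpoint p-value:", round(p_locf, 3))
print(fit.summary())
```

Repeating such a simulation many times and recording how often each analysis rejects at the 5% level would reproduce the kind of Type I error comparison reported in the abstract.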
2.
Economists have become increasingly interested in hypotheses from sociobiology as a source of inspiration for filling gaps in the economic model of behavior. To avoid borrowing eclectically and arbitrarily from neighboring disciplines, this paper attempts to outline in a systematic way the similarities and differences between the approaches taken in economics and sociobiology. In doing so, special attention is given to an empirical theory of preferences that is lacking in economics. Here, inspiration from sociobiology would seem to be particularly useful. The considerations in the paper suggest that sociobiological arguments may indeed be helpful, albeit at a very elementary level only. A more comprehensive theory cannot ignore the influences of innate learning mechanisms in higher living beings. An elaborated theory of preferences in economics will have to acknowledge and incorporate insights from behavioral psychology.
3.
4.
5.
The literature reports some contradictory results on the degree of phonological specificity of infants' early lexical representations in a Romance language (French) and in Germanic languages. It is not clear whether these discrepancies are due to differences in method, in language characteristics, or in participants' age. In this study, we examined whether 12- and 17-month-old French-speaking infants are able to distinguish well-pronounced from mispronounced words (mispronunciations changing one or two features of the initial consonant). To this end, 46 infants participated in a preferential looking experiment in which they were presented with pairs of pictures together with a spoken word that was either well pronounced or mispronounced. The results show that both 12- and 17-month-old infants look longer at the pictures corresponding to well-pronounced words than to mispronounced words, but show no difference between the two mispronunciation types. These results suggest that, as early as 12 months, French-speaking infants, like those exposed to Germanic languages, already possess detailed phonological representations of familiar words.
6.
This paper proposes a hierarchical probabilistic model for ordinal matrix factorization. Unlike previous approaches, we model the ordinal nature of the data and take a principled approach to incorporating priors for the hidden variables. Two algorithms are presented for inference, one based on Gibbs sampling and one based on variational Bayes. Importantly, these algorithms can be applied to the factorization of very large matrices with missing entries.
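For intuition, here is a minimal, self-contained sketch of the modelling idea: low-rank latent factors passed through ordered cutpoints with a cumulative-logit likelihood and Gaussian priors on the factors. It is fitted here by generic MAP optimization rather than the Gibbs sampling or variational Bayes schemes described in the abstract, and all dimensions and parameter values are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's algorithm): ordinal matrix factorization
# with a cumulative-logit likelihood, fitted by MAP optimization.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(1)
n_users, n_items, rank, n_levels = 30, 20, 2, 5

# Simulate a small ordinal matrix with missing entries
U_true = rng.normal(size=(n_users, rank))
V_true = rng.normal(size=(n_items, rank))
scores = U_true @ V_true.T + rng.normal(scale=0.5, size=(n_users, n_items))
cuts_true = np.array([-1.5, -0.5, 0.5, 1.5])
R = 1 + np.searchsorted(cuts_true, scores)           # ordinal levels 1..5
mask = rng.random((n_users, n_items)) < 0.6          # ~60% of entries observed

def unpack(theta):
    U = theta[:n_users * rank].reshape(n_users, rank)
    V = theta[n_users * rank:(n_users + n_items) * rank].reshape(n_items, rank)
    deltas = theta[(n_users + n_items) * rank:]
    cuts = np.cumsum(np.concatenate([[deltas[0]], np.exp(deltas[1:])]))  # ordered cutpoints
    return U, V, cuts

def neg_log_posterior(theta, lam=0.1):
    U, V, cuts = unpack(theta)
    s = U @ V.T
    # Cumulative-logit model: P(r = k) = sigmoid(c_k - s) - sigmoid(c_{k-1} - s)
    upper = np.concatenate([cuts, [np.inf]])[R - 1]
    lower = np.concatenate([[-np.inf], cuts])[R - 1]
    p = expit(upper - s) - expit(lower - s)
    ll = np.log(np.clip(p, 1e-12, None))[mask].sum()
    # Gaussian priors on the factors (MAP estimate, not a full posterior)
    return -(ll - lam * (np.sum(U ** 2) + np.sum(V ** 2)))

theta0 = np.concatenate([rng.normal(scale=0.1, size=(n_users + n_items) * rank),
                         [-1.5], np.log(np.ones(n_levels - 2))])
res = minimize(neg_log_posterior, theta0, method="L-BFGS-B")
U_hat, V_hat, cuts_hat = unpack(res.x)
print("fitted cutpoints:", np.round(cuts_hat, 2))
```

Because only the observed entries enter the likelihood, missing entries are handled naturally; the fitted factors and cutpoints can then be used to predict the full ordinal matrix.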
7.
The Global Justice Movements emerged in the context of the contradictions and crisis of neoliberal–imperial globalization and the critique of it. They therefore express and provide a basis for the politicization of the negative consequences of post-Fordism and its crisis. This article examines the structural changes of the last 30 years from a Gramscian perspective, viewing neoliberal globalization as a "passive revolution" and as the deepening of an "imperial mode of living" on a global scale. It is argued that examining structural changes helps us to understand why protest and social movements re-emerged around the year 2000. The article discusses some central features of the Global Justice Movements by focusing on the international Attac movement and the recent Occupy movement.
8.
Mark J. Kaiser, Risk Analysis, 2015, 35(8): 1562–1590
Public companies in the United States are required to report standardized values of their proved reserves and asset retirement obligations on an annual basis. When compared, these two measures provide an aggregate indicator of corporate decommissioning risk but, because of their consolidated nature, cannot readily be decomposed at a more granular level. The purpose of this article is to introduce a decommissioning risk metric defined in terms of the ratio of the expected value of an asset's reserves to its expected cost of decommissioning. Asset decommissioning risk (ADR) is more difficult to compute than a consolidated corporate risk measure, but can be used to quantify the decommissioning risk of structures and to perform regional comparisons, and also provides market signals of future decommissioning activity. We formalize two risk metrics for decommissioning and apply the ADR metric to the deepwater Gulf of Mexico (GOM) floater inventory. Deepwater oil and gas structures are expensive to construct, and at the end of their useful life, will be expensive to decommission. The value of proved reserves for the 42 floating structures in the GOM circa January 2013 is estimated to range between $37 and $80 billion for future oil prices between 60 and 120 $/bbl, which is about 10 to 20 times greater than the estimated $4.3 billion to decommission the inventory. Eni's Allegheny and MC Offshore's Jolliet tension leg platforms have ADR metrics less than one and are approaching the end of their useful life. Application of the proposed metrics in the regulatory review of supplemental bonding requirements in the U.S. Outer Continental Shelf is suggested to complement the current suite of financial metrics employed.
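The metric itself is a simple ratio, so a short worked example may help. The aggregate figures are those quoted in the abstract; the single-asset inputs below are made-up placeholders, not data from the article.

```python
# Illustrative only: the asset decommissioning risk (ADR) ratio described in
# the abstract, computed for the aggregate figures quoted there and for a
# hypothetical single asset.
def adr(expected_reserve_value, expected_decommissioning_cost):
    """ADR = expected value of an asset's reserves / expected decommissioning cost."""
    return expected_reserve_value / expected_decommissioning_cost

# Aggregate GOM floater inventory, circa January 2013 (figures from the abstract)
print(adr(37e9, 4.3e9))   # ~8.6 at 60 $/bbl
print(adr(80e9, 4.3e9))   # ~18.6 at 120 $/bbl

# Hypothetical single asset: 20 MMbbl proved reserves at a net value of
# 35 $/bbl, against a $250 million expected decommissioning cost
reserves_bbl, net_value_per_bbl, decom_cost = 20e6, 35.0, 250e6
print(adr(reserves_bbl * net_value_per_bbl, decom_cost))   # 2.8
```

An ADR well above one indicates that remaining reserve value comfortably covers the decommissioning liability, while a value near or below one signals that decommissioning is likely to become imminent.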
9.
The main purpose of dose-escalation trials is to identify the dose(s) that are safe and efficacious for further investigation in later studies. In this paper, we introduce dose-escalation designs that incorporate both dose-limiting events, i.e. dose-limiting toxicities (DLTs), and indicative responses of efficacy into the procedure. A flexible nonparametric model is used for modelling the continuous efficacy responses, while a logistic model is used for the binary DLTs. Escalation decisions are based on combining the probability of a DLT and the expected efficacy through a gain function. On the basis of this setup, we then introduce two types of Bayesian adaptive dose-escalation strategies. The first type, called "single objective," aims to identify and recommend a single dose: either the maximum tolerated dose (the highest dose that is considered safe) or the optimal dose (a safe dose that gives the best benefit-risk balance). The second type, called "dual objective," aims to jointly estimate both the maximum tolerated dose and the optimal dose accurately. The recommended doses obtained under these dose-escalation procedures provide information about the safety and efficacy profile of the novel drug to facilitate later studies. We evaluate different strategies via simulations based on an example constructed from a real trial on patients with type 2 diabetes, and the use of stopping rules is assessed. We find that the nonparametric model estimates the efficacy responses well for different underlying true shapes. The dual-objective designs give better results in terms of identifying the two true target doses compared to the single-objective designs.
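To make the escalation criterion concrete, the sketch below combines a logistic dose-toxicity model with an assumed monotone efficacy curve through a simple gain function, and reads off both an optimal dose and a maximum tolerated dose from a candidate grid. The functional forms, parameter values, penalty, and toxicity target are hypothetical placeholders, not those used in the paper.

```python
# Hypothetical gain-function sketch: logistic dose-toxicity model plus an
# assumed efficacy curve, used to rank candidate doses.
import numpy as np

doses = np.array([1.0, 2.0, 4.0, 8.0, 16.0])      # candidate dose levels (arbitrary units)

def p_dlt(dose, a=-4.0, b=1.5):
    """Logistic model for the probability of a dose-limiting toxicity."""
    return 1.0 / (1.0 + np.exp(-(a + b * np.log(dose))))

def expected_efficacy(dose, emax=1.0, ed50=5.0):
    """Placeholder monotone efficacy curve (an Emax shape, purely illustrative)."""
    return emax * dose / (ed50 + dose)

def gain(dose, tox_target=0.33, penalty=5.0):
    """Trade efficacy against toxicity; penalize doses likely to exceed the DLT target."""
    tox = p_dlt(dose)
    return expected_efficacy(dose) - penalty * max(tox - tox_target, 0.0)

gains = [gain(d) for d in doses]
optimal_dose = doses[int(np.argmax(gains))]        # best benefit-risk on the grid
safe = doses[p_dlt(doses) <= 0.33]                 # doses below the toxicity target
mtd = safe.max() if safe.size else None            # highest dose considered safe
print("optimal dose:", optimal_dose, "MTD estimate:", mtd)
```

A single-objective strategy would recommend one of these two targets, whereas a dual-objective strategy, as described above, aims to estimate both accurately at the end of escalation.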
10.
Abstract. There are two economic reasons for supporting the Internal Market Programme of the EC with social policies: the first argument refers to welfare theory, the second to policies of distribution. First, economic integration without a social dimension may lead to welfare losses; it may therefore be necessary to correct market forces, or to support them, because market imperfections prevent them from making the best possible use of productive facilities. Second, it cannot be excluded that the level and structure of social provisions will not be accepted for overriding distributional reasons: market outcomes could miss the aim of aligning and further improving living and working conditions within the EC. These two arguments should be followed up in the discussion of the pros and cons of harmonizing social systems in the EC. For the purposes of this paper, these rather fundamental considerations are applied to the following concrete items: dismissal protection, non-standard forms of employment, and working hours. These regulations influence the numerical flexibility of enterprises, i.e. their scope for quantitative adjustments in staffing. Steadily growing competition in Europe will also increase the importance of enterprises' flexibility potential in the different countries as a factor in location decisions. Extensive restrictions on flexibility could prove to be a competitive disadvantage relative to countries offering more scope for flexibilization. Therefore, after comparing flexibility potentials in the EC countries, the implications for European law (keyword: Social Charter) are discussed.