Full-text access type
Paid full text | 410 articles |
Free | 3 articles |
Subject classification
Management | 60 articles |
Demography | 7 articles |
Book series & collections | 4 articles |
Theory & methodology | 11 articles |
General | 43 articles |
Sociology | 4 articles |
Statistics | 284 articles |
Publication year
2023 | 1 article |
2022 | 3 articles |
2021 | 4 articles |
2020 | 3 articles |
2019 | 7 articles |
2018 | 11 articles |
2017 | 16 articles |
2016 | 10 articles |
2015 | 1 article |
2014 | 15 articles |
2013 | 111 articles |
2012 | 31 articles |
2011 | 15 articles |
2010 | 12 articles |
2009 | 15 articles |
2008 | 19 articles |
2007 | 14 articles |
2006 | 6 articles |
2005 | 11 articles |
2004 | 4 articles |
2003 | 11 articles |
2002 | 6 articles |
2001 | 5 articles |
2000 | 4 articles |
1999 | 2 articles |
1998 | 5 articles |
1997 | 4 articles |
1996 | 3 articles |
1995 | 4 articles |
1994 | 3 articles |
1993 | 1 article |
1992 | 3 articles |
1991 | 6 articles |
1990 | 4 articles |
1989 | 8 articles |
1988 | 4 articles |
1987 | 1 article |
1986 | 2 articles |
1985 | 5 articles |
1984 | 5 articles |
1983 | 3 articles |
1982 | 6 articles |
1981 | 3 articles |
1980 | 2 articles |
1979 | 1 article |
1978 | 3 articles |
Sorted by: 413 results in total; search took 843 ms
51.
ABSTRACT This article presents a reliable method for highlighting a defective stage within a manufacturing process when the existence of a failure is only known at the end of the process. It was developed in the context of integrated circuit manufacturing, where low costs and high yields are indispensable if the manufacturer is to be competitive. Change detection methods were used to point out the defective stage; two methods were compared and the better one chosen. This approach made it possible to solve several yield problems in which the engineers' investigations had strayed far from the true cause of failure. However, the reliability of the suspicions cast on the incriminated stage must be assessed; otherwise, engineers could be sent to do useless work and time could be wasted investigating events that are not the true cause of failure. Two complementary tools were implemented for this reliability assessment, and their efficiency is illustrated by several examples.
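The abstract does not name the two change detection methods that were compared; a one-sided CUSUM on per-stage yield is a standard choice for this kind of problem and can serve as a minimal sketch (the function name, data, and tuning parameters below are all illustrative, not taken from the article):

```python
def cusum_alarm(samples, target, slack, threshold):
    """One-sided CUSUM: flag a downward shift in yield below `target`.

    `slack` absorbs ordinary noise; `threshold` sets the sensitivity.
    Returns the index of the first alarm, or None if no shift is detected.
    """
    s = 0.0
    for i, x in enumerate(samples):
        # Accumulate evidence of a downward shift, resetting at zero.
        s = max(0.0, s + (target - x) - slack)
        if s > threshold:
            return i
    return None

# Yield drops from ~0.95 to ~0.80 starting at stage 10.
yields = [0.95] * 10 + [0.80] * 5
print(cusum_alarm(yields, target=0.95, slack=0.02, threshold=0.2))  # → 11
```

The alarm fires one stage after the shift begins; in practice the threshold would be tuned to trade detection delay against false alarms, which is exactly the reliability concern the abstract raises.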
52.
We provide a method for finding the optimal double sampling plan for estimating the mean value of a continuous outcome. It is assumed that the fallible and true outcome data are related by a multivariate linear regression model where only some of the explanatory variables are sampled. Conditions under which double sampling is preferred over standard sampling plans are determined. An application of the method to a well-known data set on air pollution is presented.
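As a sketch of the underlying idea in the single-covariate case, the classical regression estimator under double sampling combines a large, cheap sample of the fallible measurement x with a small subsample on which the true outcome y is also observed (the data and names below are illustrative; the article's multivariate model generalizes this):

```python
def double_sampling_mean(x_full, x_sub, y_sub):
    """Regression estimator of the mean of y under double sampling:
    the subsample mean of y, adjusted by the fitted slope times the
    gap between the full-sample and subsample means of x."""
    n = len(y_sub)
    xbar_full = sum(x_full) / len(x_full)
    xbar_sub = sum(x_sub) / n
    ybar_sub = sum(y_sub) / n
    # Least-squares slope of y on x within the subsample.
    sxy = sum((xi - xbar_sub) * (yi - ybar_sub) for xi, yi in zip(x_sub, y_sub))
    sxx = sum((xi - xbar_sub) ** 2 for xi in x_sub)
    b = sxy / sxx
    return ybar_sub + b * (xbar_full - xbar_sub)

# Toy data where y = 2x exactly: the estimator recovers 2 * mean(x_full).
print(double_sampling_mean([1, 2, 3, 4, 5, 6], [1, 3, 5], [2, 6, 10]))  # → 7.0
```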
53.
A global measure of biomarker effectiveness is the Youden index, the maximum difference between sensitivity, the probability of correctly classifying diseased individuals, and 1 − specificity, the probability of incorrectly classifying healthy individuals. The cut-point attaining this maximum is the optimal cut-point when equal weight is given to sensitivity and specificity. Using the delta method, we present approaches for estimating confidence intervals for the Youden index and the corresponding optimal cut-point for normally distributed biomarkers and also for those following gamma distributions. We also provide confidence intervals using various bootstrapping methods. A comparison of interval width and coverage probability is conducted through simulation over a variety of parametric situations. Confidence intervals via the delta method are shown to have both coverage closer to nominal and shorter interval widths than confidence intervals from the bootstrapping methods.
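For normally distributed biomarkers the index takes a simple form: with healthy values N(μh, σh²) and diseased values N(μd, σd²), diseased assumed higher, J(c) = sensitivity(c) + specificity(c) − 1 can be maximized over c directly. A minimal sketch with illustrative parameters (the article's interval estimation via the delta method is not shown):

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def youden(c, mu_h, sd_h, mu_d, sd_d):
    """J(c) = sensitivity(c) + specificity(c) - 1, assuming diseased
    biomarker values run higher than healthy ones."""
    sensitivity = 1.0 - phi((c - mu_d) / sd_d)
    specificity = phi((c - mu_h) / sd_h)
    return sensitivity + specificity - 1.0

# Grid search for the optimal cut-point (a closed form exists for
# normal biomarkers; a grid keeps the sketch short).
mu_h, sd_h, mu_d, sd_d = 0.0, 1.0, 2.0, 1.0
grid = [i / 100.0 for i in range(-200, 400)]
c_star = max(grid, key=lambda c: youden(c, mu_h, sd_h, mu_d, sd_d))
print(c_star)  # equal variances: optimum at the midpoint of the means, 1.0
```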
54.
Group testing has its origin in the identification of syphilis in the U.S. Army during World War II. Much of the theoretical framework of group testing was developed starting in the late 1950s, with continued work into the 1990s. Recently, with the advent of new laboratory and genetic technologies, there has been increasing interest in group testing designs for cost-saving purposes. In this article, we compare different nested designs, including Dorfman, Sterrett, and an optimal nested procedure obtained through dynamic programming. To elucidate these comparisons, we develop closed-form expressions for the optimal Sterrett procedure and provide a concise review of the prior literature on other commonly used procedures. We consider designs in which the prevalence of disease is known, and we investigate the robustness of these procedures when the prevalence is misspecified. The article's technical presentation will be of interest to researchers and is also valuable from a pedagogical perspective. Supplementary material for this article is available online.
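Dorfman's procedure admits a well-known closed form: with prevalence p and pool size k, one pooled test is shared by k people, and all k are retested individually when the pool is positive, so the expected number of tests per person is 1/k + 1 − (1 − p)^k. A short sketch of finding the cost-minimizing pool size (the prevalence value is illustrative):

```python
def dorfman_tests_per_person(p, k):
    """Expected tests per individual under Dorfman pooling with
    pool size k and prevalence p."""
    return 1.0 / k + 1.0 - (1.0 - p) ** k

def best_pool_size(p, k_max=100):
    """Pool size minimizing the expected tests per person."""
    return min(range(2, k_max + 1), key=lambda k: dorfman_tests_per_person(p, k))

# At 1% prevalence, pooling cuts testing to about a fifth of one-at-a-time.
print(best_pool_size(0.01))  # → 11
```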
55.
In many toxicological assays, interactions between primary and secondary effects may cause a downturn in mean responses at high doses. In this situation, the typical monotonicity assumption is invalid and may be quite misleading. Prior literature addresses the analysis of response functions with a downturn, but, so far as we know, this paper initiates the study of experimental design for this situation. A growth model is combined with a death model to allow for the downturn in mean responses at high doses. Several different objective functions are studied. When the number of treatments equals the number of parameters, the Fisher information is found to be independent of the model of the treatment means and of the magnitudes of the treatments. In general, A- and DA-optimal weights for estimating adjacent mean differences are found analytically for a simple model and numerically for a biologically motivated model. Results on c-optimality are also obtained for estimating the peak dose and the EC50 (the treatment with response halfway between the control and the peak response on the increasing portion of the response function). Finally, when interest lies only in the increasing portion of the response function, we propose composite D-optimal designs.
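The growth-times-death structure can be illustrated with a hypothetical model (the abstract does not specify the article's actual models, so everything below is an assumed example): an Emax-style growth term multiplied by an exponential death term rises, peaks, and then turns down, which is exactly where the monotonicity assumption fails.

```python
import math

def mean_response(dose, emax=1.0, ed50=1.0, kill=0.3):
    """Hypothetical downturn model: an Emax growth term times an
    exponential death term. Illustrative parameters only."""
    return emax * dose / (ed50 + dose) * math.exp(-kill * dose)

# Grid search for the peak dose; the response rises to this point
# and declines beyond it.
peak_dose = max((i / 100.0 for i in range(1, 1001)), key=mean_response)
print(peak_dose)  # peak dose near 1.4 under these parameters
```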
56.
The aim of this article is to establish the optimal control of a periodic-review inventory system with two suppliers. One supplier delivers orders immediately; the other is unreliable, delivering orders immediately only with probability p ∈ (0, 1). Two cases are considered. In the first, any inventory amount may be ordered from each supplier. In the second, the system budget is restricted.
57.
By running the life tests at higher stress levels than normal operating conditions, accelerated life testing quickly yields information on the lifetime distribution of a test unit. The lifetime at the design stress is then estimated through extrapolation using a regression model. In constant-stress testing, a unit is tested at a fixed stress level until failure or the termination time point of the test, while step-stress testing allows the experimenter to gradually increase the stress levels at some pre-fixed time points during the test. In this article, the optimal k-level constant-stress and step-stress accelerated life tests are compared for the exponential failure data under Type-I censoring. The objective is to quantify the advantage of using the step-stress testing relative to the constant-stress one. A log-linear relationship between the mean lifetime parameter and stress level is assumed and the cumulative exposure model holds for the effect of changing stress in step-stress testing. The optimal design point is then determined under C-optimality, D-optimality, and A-optimality criteria. The efficiency of step-stress testing compared to constant-stress testing is discussed in terms of the ratio of optimal objective functions based on the information matrix.
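For exponential lifetimes under Type-I censoring, both building blocks of the extrapolation are simple: the MLE of the mean lifetime at each stress level is the total time on test divided by the number of observed failures, and the assumed log-linear life-stress relationship is an ordinary least-squares fit on the log scale. A minimal sketch with illustrative data (the cumulative exposure adjustment for step-stress testing is omitted):

```python
import math

def exp_mean_mle_type1(lifetimes, tau):
    """MLE of the exponential mean under Type-I censoring at time tau:
    total time on test / number of observed failures
    (assumes at least one failure before tau)."""
    total_time = sum(min(t, tau) for t in lifetimes)
    failures = sum(1 for t in lifetimes if t <= tau)
    return total_time / failures

def loglinear_fit(stresses, theta_hats):
    """Least-squares fit of log(theta) = b0 + b1 * s, the assumed
    log-linear relationship between mean lifetime and stress."""
    n = len(stresses)
    logs = [math.log(t) for t in theta_hats]
    sbar, ybar = sum(stresses) / n, sum(logs) / n
    b1 = (sum((s - sbar) * (y - ybar) for s, y in zip(stresses, logs))
          / sum((s - sbar) ** 2 for s in stresses))
    return ybar - b1 * sbar, b1

# Two units fail before tau = 2.0; the two censored units contribute tau each.
print(exp_mean_mle_type1([0.5, 1.5, 3.0, 5.0], 2.0))  # → 3.0
# Fitting theta-hats at two stress levels, then extrapolating to s = 0
# via exp(b0) gives the predicted mean lifetime at the design stress.
b0, b1 = loglinear_fit([1.0, 2.0], [math.e ** 2, math.e])
print(math.exp(b0))
```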
58.
This article studies the dispatch of consolidated shipments. Orders, following a batch Markovian arrival process, are received in discrete quantities by a depot at discrete time epochs. Instead of immediate dispatch, all outstanding orders are consolidated and shipped together at a later time. The decision of when to send out the consolidated shipment is made based on a “dispatch policy,” which is a function of the system state and/or the costs associated with that state. First, a tree structured Markov chain is constructed to record specific information about the consolidation process; the effectiveness of any dispatch policy can then be assessed by a set of long-run performance measures. Next, the effect on shipment consolidation of varying the order-arrival process is demonstrated through numerical examples and proved mathematically under some conditions. Finally, a heuristic algorithm is developed to determine a favorable parameter of a special set of dispatch policies, and the algorithm is proved to yield the overall optimal policy under certain conditions.
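As a toy illustration of what a dispatch policy trades off (far simpler than the article's tree-structured Markov chain; the policy, costs, and data below are hypothetical), a quantity-based policy ships all outstanding orders once they reach a threshold, balancing per-shipment cost against holding cost:

```python
def dispatch_cost(arrivals, q_threshold, ship_cost, hold_cost):
    """Total cost of a quantity-based dispatch policy over a horizon:
    each period, holding cost is charged on all outstanding orders;
    when the outstanding quantity reaches q_threshold, everything is
    shipped at a fixed cost."""
    outstanding, total = 0, 0.0
    for qty in arrivals:
        outstanding += qty
        total += hold_cost * outstanding  # hold everything this period
        if outstanding >= q_threshold:
            total += ship_cost
            outstanding = 0
    return total

# Two shipments are triggered over four periods.
print(dispatch_cost([2, 3, 1, 4], 5, 10.0, 1.0))  # → 33.0
```

Sweeping `q_threshold` over a grid would mimic, in miniature, the article's search for a favorable policy parameter.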
59.
Henrik Jacobsen Kleven, Claus Thustrup Kreiner & Emmanuel Saez, Econometrica: Journal of the Econometric Society, 2009, 77(2): 537–560
This paper analyzes the general nonlinear optimal income tax for couples, a multidimensional screening problem. Each couple consists of a primary earner who always participates in the labor market, but makes an hours‐of‐work choice, and a secondary earner who chooses whether or not to work. If second‐earner participation is a signal of the couple being better (worse) off, we prove that optimal tax schemes display a positive tax (subsidy) on secondary earnings and that the tax (subsidy) on secondary earnings decreases with primary earnings and converges to zero asymptotically. We present calibrated microsimulations for the United Kingdom showing that decreasing tax rates on secondary earnings is quantitatively significant and consistent with actual income tax and transfer programs.
60.
We consider circular block designs for field trials when there is two-sided spatial interference between neighbouring plots of the same block. The parameters of interest are the total effects, i.e., the sum of the direct effect of a treatment and its neighbour effects, which corresponds to the use of a single treatment in the whole field. We determine universally optimal approximate designs. When the number of blocks may be large, we propose efficient exact designs generated by a single sequence of treatments. We also give efficiency factors of the usual binary neighbour-balanced block designs, which can be used when the number of blocks is small.