Article Search
Full-text access: 59 subscription articles, 1 free article.
By subject: Management (18), Demography (2), Theory and methodology (2), General (2), Sociology (16), Statistics (20).
By year: 2023 (2), 2021 (1), 2020 (1), 2019 (1), 2018 (1), 2017 (2), 2015 (1), 2014 (2), 2013 (16), 2012 (8), 2011 (2), 2010 (4), 2009 (1), 2008 (6), 2007 (2), 2006 (3), 2005 (1), 2004 (1), 2001 (1), 1994 (1), 1989 (1), 1984 (1), 1979 (1).
A total of 60 results were found (search time: 46 ms).
33.
To characterize the dependence of a response on covariates of interest, a monotonic structure is linked to a multivariate polynomial transformation of the central subspace (CS) directions with unknown structural degree and dimension. Under a very general semiparametric model formulation, such a sufficient dimension reduction (SDR) score is shown to exist and to be optimal and unique up to scale and location with respect to the defined concordance probability function. In light of these properties and its single-index representation, two types of concordance-based generalized Bayesian information criteria are constructed to estimate the optimal SDR score and the maximum concordance index. The estimation criteria are further implemented through effective computational procedures. In general, the outer-product-of-gradients estimation in the first approach has an advantage in computational efficiency, while the parameterization system in the second approach greatly reduces the number of parameters to be estimated. Unlike most existing SDR approaches, the proposals require only one CS direction to be continuous. Moreover, the consistency of the structural degree and dimension estimators and the asymptotic normality of the optimal SDR score and maximum concordance index estimators are established under suitable conditions. The performance and practicality of our methodology are investigated through simulations and empirical illustrations.
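As a point of reference for the concordance criterion used above, the display below sketches one standard form of a concordance probability for a candidate score. The notation (response Y, covariates X, generic score s) is my own and not taken from the paper, and the paper's multivariate polynomial construction of the score is not reproduced here.

```latex
% Generic concordance probability of a candidate score s(.), with
% (X_1, Y_1) and (X_2, Y_2) independent copies of (X, Y); the maximum
% concordance index is its supremum over admissible scores.
\[
  C(s) \;=\; \Pr\bigl( s(X_1) > s(X_2) \,\bigm|\, Y_1 > Y_2 \bigr),
  \qquad
  C^{\ast} \;=\; \sup_{s}\, C(s).
\]
```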
34.
In assessing biosimilarity between two products, the question to ask is always “How similar is similar?” Traditionally, the equivalence of the means between products is the primary consideration in a clinical trial. This study suggests an alternative assessment that tests whether a certain percentage of the population of differences lies within a prespecified interval. In doing so, accuracy and precision are assessed simultaneously by judging whether a two-sided tolerance interval falls within a prespecified acceptance range. We further derive an asymptotic distribution of the tolerance limits to determine the sample size needed to achieve a targeted level of power. Our numerical study shows that the proposed two-sided tolerance interval test controls the type I error rate and provides sufficient power. A real example is presented to illustrate our proposed approach.
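As a rough illustration of judging whether a two-sided tolerance interval falls within a prespecified acceptance range, the sketch below computes an approximate two-sided normal tolerance interval (Howe-type tolerance factor) and compares it with acceptance limits. The coverage, confidence level, acceptance range, and simulated differences are placeholders; this is not the paper's exact test or sample-size procedure.

```python
import numpy as np
from scipy.stats import norm, chi2

def two_sided_tolerance_interval(x, coverage=0.9, confidence=0.95):
    """Approximate two-sided normal tolerance interval x_bar +/- k*s,
    using a Howe-type approximation for the tolerance factor k."""
    n = len(x)
    df = n - 1
    z = norm.ppf((1 + coverage) / 2)
    k = np.sqrt(df * (1 + 1 / n) * z**2 / chi2.ppf(1 - confidence, df))
    return x.mean() - k * x.std(ddof=1), x.mean() + k * x.std(ddof=1)

# Hypothetical paired differences between test and reference products.
rng = np.random.default_rng(1)
diff = rng.normal(loc=0.05, scale=0.4, size=30)

lower, upper = two_sided_tolerance_interval(diff)
acceptance = (-1.0, 1.0)          # placeholder acceptance range
passes = acceptance[0] <= lower and upper <= acceptance[1]
print(round(lower, 3), round(upper, 3), passes)
```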
35.
Increasing research evidence indicates that economic inequality leads the rich to be less generous than the poor. While compelling, the underlying mechanism behind this finding remains elusive. We conduct a laboratory experiment to investigate how inequality influences people’s behavior in a sharing game. We vary the cause of inequality to see how people share payoffs with others when inequality is caused respectively by chance, competition, and choice. The experimental results show that the rich give less than the poor only when inequality is self-chosen. Yet, in contrast to findings in previous studies, increasing inequality does not reinforce but instead mitigates the negative relationship between income and giving. Our study suggests that research on the consequences of inequality should be careful in discerning whether self-chosen inequality could account for the spurious effect of inequality on people’s prosocial behavior.
36.
Dissolution is one of the tests required and specified by the United States Pharmacopeia and National Formulary (USP/NF) to ensure that drug products meet standards of identity, strength, quality, purity, and stability. Sponsors also establish in-house specifications for the mean and standard deviation of the dissolution rates to guarantee a high probability of passing the USP/NF dissolution test. However, the USP/NF dissolution test is a complicated three-stage sampling plan that involves both the sample mean dissolution rate of all units and the dissolution rates of individual units. It turns out that the true probability of passing the USP/NF dissolution test is formidable to compute analytically even when the population mean and variance of the dissolution rates are known, and it is not clear that previously proposed methods actually estimate this true passing probability. We therefore propose to employ a parametric bootstrap method in conjunction with Monte Carlo simulation to obtain the sampling distribution of the estimated probabilities of passing the USP/NF dissolution test, and hence a confidence interval for the passing probability. In addition, a procedure is proposed to test whether the true probability of passing the USP/NF dissolution test is greater than some specified value. A numerical example illustrates the proposed method. Copyright © 2011 John Wiley & Sons, Ltd.
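To make the three-stage structure concrete, the sketch below is a plain Monte Carlo estimate of the probability of passing a USP <711>-style staged dissolution test when individual dissolution rates are assumed normal. The S1/S2/S3 rules follow the commonly cited acceptance table, and the mu, sigma, and Q values are placeholders; this illustrates only the simulation target, not the paper's parametric-bootstrap procedure.

```python
import numpy as np

def passes_staged_test(rates, Q):
    """Staged acceptance rules in the style of the USP <711> table:
    S1: each of 6 units >= Q + 5;
    S2: mean of 12 units >= Q and no unit < Q - 15;
    S3: mean of 24 units >= Q, at most 2 units < Q - 15, no unit < Q - 25."""
    s1 = rates[:6]
    if s1.min() >= Q + 5:
        return True
    s2 = rates[:12]
    if s2.mean() >= Q and s2.min() >= Q - 15:
        return True
    s3 = rates[:24]
    return s3.mean() >= Q and np.sum(s3 < Q - 15) <= 2 and s3.min() >= Q - 25

def pass_probability(mu, sigma, Q, n_sim=20_000, seed=0):
    """Monte Carlo estimate of the passing probability when unit
    dissolution rates are i.i.d. Normal(mu, sigma^2)."""
    rng = np.random.default_rng(seed)
    rates = rng.normal(mu, sigma, size=(n_sim, 24))
    return np.mean([passes_staged_test(r, Q) for r in rates])

print(pass_probability(mu=82.0, sigma=4.0, Q=75.0))   # placeholder mu, sigma, Q
```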
37.
The performance of a decision making unit (DMU) can be evaluated in either a cross-sectional or a time-series manner, and data envelopment analysis (DEA) is a useful method for both types of evaluation. To eliminate the inconsistency caused by using different frontier facets to calculate efficiency, common-weights DEA models have been developed, under which a group of DMUs can be ranked for a specific period. This study proposes a common-weights DEA model for time-series evaluations to calculate the global Malmquist productivity index (MPI), so that the productivity changes of all DMUs have a common basis for comparison. The common-weights global MPI not only has sound properties but also produces reliable results. A case study of Taiwan's forests after reorganization shows that the MPIs calculated from the conventional DEA model produce misleading results. The common-weights global MPI approach, on the other hand, correctly identifies districts with unsatisfactory performance before the reorganization and those with unsatisfactory productivity improvement after the reorganization.
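For readers unfamiliar with the global MPI, the sketch below computes a conventional (non-common-weights) output-oriented CRS global Malmquist index against a pooled frontier using a standard DEA linear program. It only illustrates the baseline that the common-weights proposal improves upon, and the function names, variable names, and toy data are my own.

```python
import numpy as np
from scipy.optimize import linprog

def output_distance(x0, y0, X, Y):
    """Output distance function of (x0, y0) under CRS against the pooled
    ("global") reference set: X is (J x m) inputs, Y is (J x s) outputs."""
    J = X.shape[0]
    c = np.r_[-1.0, np.zeros(J)]                   # maximise the expansion factor phi
    A_in = np.c_[np.zeros((X.shape[1], 1)), X.T]   # sum_j lam_j * x_ij <= x_i0
    A_out = np.c_[y0.reshape(-1, 1), -Y.T]         # phi*y_r0 - sum_j lam_j * y_rj <= 0
    res = linprog(c, A_ub=np.r_[A_in, A_out],
                  b_ub=np.r_[x0, np.zeros(Y.shape[1])],
                  bounds=[(0, None)] * (J + 1))
    return 1.0 / res.x[0]                          # D = 1 / phi*

def global_mpi(x_t, y_t, x_t1, y_t1, X_all, Y_all):
    """Global Malmquist productivity index of one DMU from period t to t+1,
    measured against the frontier of all DMU-period observations pooled;
    values above 1 indicate productivity improvement."""
    return output_distance(x_t1, y_t1, X_all, Y_all) / output_distance(x_t, y_t, X_all, Y_all)

# Toy data: 3 DMUs observed in 2 periods, one input and one output each.
X_all = np.array([[2.0], [3.0], [4.0], [2.0], [3.0], [4.0]])
Y_all = np.array([[1.0], [2.0], [2.0], [1.5], [2.5], [2.2]])
print(global_mpi(X_all[0], Y_all[0], X_all[3], Y_all[3], X_all, Y_all))
```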
38.
Materials formulation and processing research are important industrial activities, and most materials know-how comes from physical experiments. Our impression, based on discussions with materials scientists, is that statistically planned experiments are infrequently used in materials research. This scientific and engineering area therefore provides an excellent opportunity both for applying the available techniques of statistically planned experiments, including mixture experiments, and for identifying opportunities for collaborative research leading to further advances in statistical methods for scientists and engineers. This paper describes an application of Scheffé's (1958) simplex approach for mixture experiments to the formulation of high-temperature superconducting compounds. This example has given us a better appreciation of the needs of materials scientists and has provided opportunities for further collaborative research.
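Since the paper builds on Scheffé's simplex approach, the sketch below generates a {3, 2} simplex-lattice design and fits the Scheffé quadratic canonical polynomial by least squares. The response values are made up for illustration; this is a generic mixture-experiment sketch, not the superconductor formulation study itself.

```python
import itertools
import numpy as np

def simplex_lattice(q, m):
    """{q, m} simplex-lattice design: all q-component mixtures whose
    proportions are multiples of 1/m and sum to one (Scheffe, 1958)."""
    pts = [np.array(c) / m for c in itertools.product(range(m + 1), repeat=q)
           if sum(c) == m]
    return np.array(pts)

def scheffe_quadratic_fit(X, y):
    """Least-squares fit of the Scheffe quadratic canonical polynomial
    E[y] = sum_i b_i x_i + sum_{i<j} b_ij x_i x_j (no intercept)."""
    q = X.shape[1]
    cross = [X[:, i] * X[:, j] for i, j in itertools.combinations(range(q), 2)]
    Z = np.column_stack([X] + cross)
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return beta

# Example: a {3, 2} lattice (6 blends) with illustrative response values.
X = simplex_lattice(3, 2)
y = np.array([80.1, 84.3, 79.5, 88.0, 82.6, 90.2])   # hypothetical data
print(scheffe_quadratic_fit(X, y))
```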
39.
The primary purpose of this study was to determine whether the effect of leadership on Six Sigma project success (PS) may be mediated by member cohesiveness (MC). The second objective was to examine whether the impact of MC on PS is moderated by resource management. The effects of the project manager's leadership styles and MC on PS were also investigated. To address these aims, a survey was used to measure the project manager's leadership styles, MC, and the overall success of Six Sigma projects. The analyses suggest that MC may serve as a mediator between leadership and PS. The results also indicate that resource allocation has a moderating effect on the relationship between MC and PS.
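As an illustration of the kind of mediation test behind the claim that member cohesiveness may mediate the leadership-success relationship, the sketch below bootstraps the indirect effect a*b in a simple three-variable mediation model. The data are simulated and the variable names are placeholders, not the authors' survey measures or their exact analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def indirect_effect(x, m, y):
    """Indirect effect a*b: a from the regression m ~ x, b from y ~ x + m
    (ordinary least squares with an intercept)."""
    a = np.linalg.lstsq(np.c_[np.ones_like(x), x], m, rcond=None)[0][1]
    b = np.linalg.lstsq(np.c_[np.ones_like(x), x, m], y, rcond=None)[0][2]
    return a * b

# Hypothetical scores: leadership (x), member cohesiveness (m), project success (y).
n = 200
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.4 * m + 0.2 * x + rng.normal(size=n)

# Percentile bootstrap CI for the indirect effect; a CI excluding 0 suggests mediation.
boot = [indirect_effect(x[i], m[i], y[i])
        for i in (rng.integers(0, n, n) for _ in range(2000))]
print(np.percentile(boot, [2.5, 97.5]))
```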
40.
A variation of the classical channel assignment problem is to assign a radio channel, represented by a nonnegative integer, to each radio transmitter so that "close" transmitters receive different channels and "very close" transmitters receive channels that are at least two channels apart. The goal is to minimize the span of a feasible assignment. This channel assignment problem can be modeled with distance-dependent graph labelings. A k-L(2,1)-labeling of a graph G is a mapping f from the vertex set of G to the set {0, 1, 2, …, k} such that |f(x) − f(y)| ≥ 2 if d(x,y) = 1 and f(x) ≠ f(y) if d(x,y) = 2, where d(x,y) is the distance between vertices x and y in G. The minimum k for which G admits a k-L(2,1)-labeling, denoted by λ(G), is called the λ-number of G. Very little is known about λ-numbers of 3-regular graphs. In this paper we focus on an important subclass of 3-regular graphs called generalized Petersen graphs. For an integer n ≥ 3, a graph G is called a generalized Petersen graph of order n if and only if G is a 3-regular graph consisting of two disjoint cycles (called the inner and outer cycles) of length n, where each vertex of the outer (resp. inner) cycle is adjacent to exactly one vertex of the inner (resp. outer) cycle. In 2002, Georges and Mauro conjectured that λ(G) ≤ 7 for all generalized Petersen graphs G of order n ≥ 7. Later, Adams, Cass and Troxell proved that Georges and Mauro's conjecture is true for orders 7 and 8. In this paper it is shown that Georges and Mauro's conjecture is true for generalized Petersen graphs of orders 9, 10, 11 and 12.
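To make the labeling definition concrete, the sketch below builds a small generalized Petersen graph P(n, k) and computes its λ-number exactly by backtracking. The exhaustive search only scales to small orders, so it is a didactic check of the definition rather than a way to verify the conjecture for orders 9-12, and the construction and function names are my own.

```python
from itertools import count

def petersen(n, k):
    """Generalized Petersen graph P(n, k): outer vertices 0..n-1 form a cycle,
    inner vertices n..2n-1 satisfy v_i ~ v_{i+k}, and spokes join u_i to v_i."""
    adj = {v: set() for v in range(2 * n)}
    def add(a, b):
        adj[a].add(b)
        adj[b].add(a)
    for i in range(n):
        add(i, (i + 1) % n)           # outer cycle
        add(n + i, n + (i + k) % n)   # inner edges
        add(i, n + i)                 # spokes
    return adj

def lambda_number(adj):
    """Smallest span s for which an s-L(2,1)-labeling exists (backtracking)."""
    verts = list(adj)
    dist2 = {v: {w for u in adj[v] for w in adj[u]} - adj[v] - {v} for v in adj}

    def extend(span, labels, v):
        for c in range(span + 1):
            ok = all(abs(c - labels[u]) >= 2 for u in adj[v] if u in labels) and \
                 all(c != labels[u] for u in dist2[v] if u in labels)
            if ok:
                labels[v] = c
                if len(labels) == len(adj) or extend(span, labels, verts[len(labels)]):
                    return True
                del labels[v]
        return False

    for span in count(0):
        if extend(span, {}, verts[0]):
            return span

adj = petersen(4, 1)        # the 4-prism, a small generalized Petersen graph
print(lambda_number(adj))
```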