497 results found (search time: 31 ms)
141.
The objective of the Interconnecting Highways problem is to construct roads of minimum total length to interconnect n given highways, under the constraint that the roads can intersect each highway only at one point in a designated interval, which is a line segment. We present a polynomial-time approximation scheme for this problem by applying Arora's framework (Arora, 1998; also available from http://www.cs.princeton.edu/~arora). For every fixed c > 1 and given any n line segments in the plane, a randomized version of the scheme finds a (1 + 1/c)-approximation to the optimal cost in O(n^{O(c)} log n) time.
142.
Kert Viele, Revue canadienne de statistique, 2001, 29(1): 51-66
The author proposes a general method for evaluating the fit of a model for functional data. His approach consists of embedding the proposed model into a larger family of models, assuming the true process generating the data is within the larger family, and then computing a posterior distribution for the Kullback-Leibler distance between the true and the proposed models. The technique is illustrated on biomechanical data reported by Ramsay, Flanagan & Wang (1995). It is developed in detail for hierarchical polynomial models such as those found in Lindley & Smith (1972), and is also generally applicable to longitudinal data analysis where polynomials are fit to many individuals.
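The Kullback-Leibler distance at the heart of this approach can be illustrated numerically. A minimal sketch (not the author's posterior computation): the KL divergence between two illustrative Gaussian densities, approximated on a grid.

```python
import numpy as np
from scipy.stats import norm

def kl_divergence(p_pdf, q_pdf, grid):
    """Numerically approximate KL(p || q) = integral of p(x) log(p(x)/q(x)) dx
    with a simple Riemann sum on a uniform grid."""
    p, q = p_pdf(grid), q_pdf(grid)
    dx = grid[1] - grid[0]
    return float(np.sum(p * np.log(p / q)) * dx)

# Illustrative choice: "true" model N(0, 1) vs. "proposed" model N(0.5, 1);
# the closed form for equal-variance Gaussians is (mu1 - mu2)^2 / 2 = 0.125.
grid = np.linspace(-10.0, 10.0, 100001)
kl = kl_divergence(norm(0.0, 1.0).pdf, norm(0.5, 1.0).pdf, grid)
print(round(kl, 4))  # ~0.125
```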
143.
144.
In this article, the generalized linear model for longitudinal data is studied. A generalized empirical likelihood method is proposed by combining generalized estimating equations and quadratic inference functions based on the working correlation matrix. It is proved that the proposed generalized empirical likelihood ratios are asymptotically chi-squared under some suitable conditions, and hence they can be used to construct confidence regions for the parameters. In addition, the maximum empirical likelihood estimates of the parameters are obtained, and their asymptotic normality is proved. Some simulations are undertaken to compare the generalized empirical likelihood and normal approximation-based methods in terms of coverage accuracy and average areas/lengths of confidence regions/intervals. A real-data example is used to illustrate the methods.
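The chi-squared calibration of empirical likelihood ratios can be illustrated in a much simpler setting than the GEE-based method above. A sketch of Owen's scalar empirical likelihood for a mean (an assumption-laden simplification, not the article's generalized version): the -2 log EL ratio is asymptotically chi-squared with one degree of freedom.

```python
import numpy as np
from scipy.optimize import brentq

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for the mean (Owen's scalar EL).
    Profiles out the Lagrange multiplier lam, where the EL weights are
    w_i proportional to 1 / (1 + lam * (x_i - mu))."""
    z = x - mu
    if z.max() <= 0 or z.min() >= 0:          # mu outside the convex hull
        return np.inf
    g = lambda lam: np.sum(z / (1.0 + lam * z))   # estimating equation in lam
    eps = 1e-10
    lo = -1.0 / z.max() + eps                 # keep all weights positive
    hi = -1.0 / z.min() - eps
    lam = brentq(g, lo, hi)
    return 2.0 * np.sum(np.log1p(lam * z))

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(el_log_ratio(x, x.mean()))   # ~0: the sample mean maximizes the EL
print(el_log_ratio(x, 2.0))        # positive: 2.0 is a less plausible mean
```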
145.
This paper is concerned with the Bernstein estimator [Vitale, R.A. (1975), ‘A Bernstein Polynomial Approach to Density Function Estimation’, in Statistical Inference and Related Topics, ed. M.L. Puri, 2, New York: Academic Press, pp. 87–99] to estimate a density with support [0, 1]. One of the major contributions of this paper is an application of a multiplicative bias correction [Terrell, G.R., and Scott, D.W. (1980), ‘On Improving Convergence Rates for Nonnegative Kernel Density Estimators’, The Annals of Statistics, 8, 1160–1163], which was originally developed for the standard kernel estimator. Moreover, the renormalised multiplicative bias corrected Bernstein estimator is studied rigorously. The mean squared error (MSE) in the interior and mean integrated squared error of the resulting bias corrected Bernstein estimators as well as the additive bias corrected Bernstein estimator [Leblanc, A. (2010), ‘A Bias-reduced Approach to Density Estimation Using Bernstein Polynomials’, Journal of Nonparametric Statistics, 22, 459–475] are shown to be O(n^{-8/9}) when the underlying density has a fourth-order derivative, where n is the sample size. The condition under which the MSE near the boundary is O(n^{-8/9}) is also discussed. Finally, numerical studies based on both simulated and real data sets are presented.
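The baseline estimator being corrected here is easy to sketch. A minimal implementation of Vitale's Bernstein density estimator (without the paper's bias corrections, which refine it), on made-up Beta-distributed data:

```python
import numpy as np
from math import comb

def bernstein_density(x, data, m):
    """Vitale's Bernstein density estimator on [0, 1]:
    f_hat(x) = m * sum_k [F_n((k+1)/m) - F_n(k/m)] * C(m-1, k) * x^k * (1-x)^(m-1-k),
    where F_n is the empirical CDF and m is the polynomial order."""
    data = np.asarray(data, dtype=float)
    x = np.asarray(x, dtype=float)
    ecdf = lambda t: float(np.mean(data <= t))
    out = np.zeros_like(x)
    for k in range(m):
        dF = ecdf((k + 1) / m) - ecdf(k / m)          # empirical mass in bin k
        out += dF * comb(m - 1, k) * x**k * (1.0 - x)**(m - 1 - k)
    return m * out

rng = np.random.default_rng(0)
data = rng.beta(2.0, 3.0, size=500)                    # illustrative sample on (0, 1)
grid = np.linspace(0.0, 1.0, 2001)
fhat = bernstein_density(grid, data, m=20)
integral = float(np.sum((fhat[:-1] + fhat[1:]) / 2) * (grid[1] - grid[0]))
print(integral)  # ~1: the estimate is itself a nonnegative density on [0, 1]
```

Nonnegativity and unit integral come for free from the Bernstein basis, which is one attraction of this estimator over boundary-corrected kernels.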
146.
Rasmus Tangsgaard Varneskov, Journal of Business & Economic Statistics, 2016, 34(1): 1-22
This article develops a general multivariate additive noise model for synchronized asset prices and provides a multivariate extension of the generalized flat-top realized kernel estimators, analyzed earlier by Varneskov (2014), to estimate its quadratic covariation. The additive noise model allows for α-mixing dependent exogenous noise, random sampling, and an endogenous noise component that encompasses synchronization errors, lead-lag relations, and diurnal heteroscedasticity. The various components may exhibit polynomially decaying autocovariances. In this setting, the class of estimators considered is consistent, asymptotically unbiased, and mixed Gaussian at the optimal rate of convergence, n^{1/4}. A simple finite sample correction based on projections of symmetric matrices ensures positive definiteness without altering the asymptotic properties of the estimators. It thereby guarantees the existence of nonlinear transformations of the estimated covariance matrix, such as correlations and realized betas, which inherit the asymptotic properties from the flat-top realized kernel estimators. An empirically motivated simulation study assesses the choice of sampling scheme and projection rule, and it shows that flat-top realized kernels have a desirable combination of robustness and efficiency relative to competing estimators. Last, an empirical analysis of signal detection and out-of-sample predictions for a portfolio of six stocks of varying size and liquidity illustrates the use and properties of the new estimators.
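The finite-sample correction mentioned here rests on projecting a symmetric matrix onto the positive semidefinite cone. A minimal sketch of one standard projection rule, eigenvalue truncation (the Frobenius-norm projection), with an illustrative indefinite matrix; the article's specific projection rule may differ:

```python
import numpy as np

def project_psd(A, floor=0.0):
    """Project a symmetric matrix onto the positive semidefinite cone by
    clipping its negative eigenvalues; this is the nearest PSD matrix in
    Frobenius norm."""
    A = 0.5 * (A + A.T)                        # symmetrize against round-off
    w, V = np.linalg.eigh(A)
    return (V * np.clip(w, floor, None)) @ V.T

# An indefinite "covariance" estimate (eigenvalues 3 and -1):
A = np.array([[1.0, 2.0], [2.0, 1.0]])
P = project_psd(A)
print(P)  # [[1.5, 1.5], [1.5, 1.5]]
```

Because the projection only truncates eigenvalues that vanish asymptotically, correlations and betas computed from P behave like those computed from the uncorrected estimate in large samples.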
147.
In this paper, we study an algorithm to compute the non-parametric maximum likelihood estimator of stochastically ordered survival functions from case 2 interval-censored data. The algorithm, denoted simply by SQP (sequential quadratic programming), re-parameterizes the likelihood function so that the order constraints become a set of linear constraints, approximates the log-likelihood by a quadratic function, and updates the estimate by solving a quadratic program. We particularly consider two stochastic orderings, the simple and uniform orderings, although the algorithm can also be applied to many other stochastic orderings. We illustrate the algorithm using the breast cancer data reported in Finkelstein and Wolfe (1985).
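The idea of feeding order constraints to an SQP solver as linear constraints can be sketched on a much simpler ordered-estimation problem than the paper's interval-censored survival setting: two binomial proportions assumed to satisfy p1 >= p2, maximized with SciPy's SLSQP (itself a sequential quadratic programming method). The data are made up.

```python
import numpy as np
from scipy.optimize import minimize

# Toy data: successes/trials in two groups assumed to satisfy p1 >= p2.
s1, n1 = 3, 10
s2, n2 = 6, 10

def neg_loglik(p):
    """Negative binomial log-likelihood for (p1, p2)."""
    p1, p2 = p
    return -(s1 * np.log(p1) + (n1 - s1) * np.log(1 - p1)
             + s2 * np.log(p2) + (n2 - s2) * np.log(1 - p2))

res = minimize(
    neg_loglik,
    x0=[0.5, 0.5],
    method="SLSQP",                           # sequential quadratic programming
    bounds=[(1e-6, 1 - 1e-6)] * 2,
    constraints=[{"type": "ineq", "fun": lambda p: p[0] - p[1]}],  # p1 - p2 >= 0
)
print(res.x)  # the unconstrained MLEs (0.3, 0.6) violate the order, so both
              # estimates pool to (s1 + s2) / (n1 + n2) = 0.45
```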
148.
149.
This article proposes a variables sampling plan that can be applied to sampling inspection of resubmitted lots when the quality characteristic of interest follows the normal distribution. Resubmission of lots for inspection is allowed in some situations, for example when the original inspection results are suspect or when the supplier or producer is permitted to opt for resampling under the provisions of the contract. The advantages of this proposed variables sampling plan over the existing single sampling variables plan are discussed. Tables are also constructed for selecting the optimal parameters of the known and unknown standard deviation variables resampling schemes for two specified points on the operating characteristic curve, namely the acceptable quality level and the limiting quality level, along with the producer's and consumer's risks. The optimization problem is formulated as a nonlinear program whose objective function, to be minimized, is the average sample number, and whose constraints relate to the lot acceptance probabilities at the acceptable quality level and the limiting quality level on the operating characteristic curve.
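For the known-sigma single sampling variables plan that serves as the baseline here, the operating characteristic curve has a closed form. A sketch, where n and k are illustrative values rather than the optimal parameters tabulated in the article, and acceptance means the sample mean satisfies xbar + k*sigma <= U for an upper specification limit U:

```python
import numpy as np
from scipy.stats import norm

def oc_curve(p, n, k):
    """Probability of accepting a lot with incoming fraction nonconforming p
    under a known-sigma variables plan (accept if xbar + k*sigma <= U)."""
    z_p = norm.ppf(1 - p)          # distance of the lot mean below U, in sigma units
    return norm.cdf(np.sqrt(n) * (z_p - k))

n, k = 20, 1.9                     # illustrative, not optimized, plan parameters
aql, lql = 0.01, 0.06
print(oc_curve(aql, n, k))         # high acceptance probability at the AQL
print(oc_curve(lql, n, k))         # low acceptance probability at the LQL
```

A two-point design would choose (n, k) so that these two probabilities hit 1 minus the producer's risk and the consumer's risk, respectively; a resampling scheme additionally tracks the average sample number across resubmissions.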
150.
Omega, 2015
A virtual business problem is studied, in which a contracting company outsources production to specialized subcontractors. The finances of the contractor and the resource capacities of the subcontractors are limited. The objective is to select subcontractors and distribute a part of the demanded production among them so that the profit of the contractor is maximized. A generalization of the knapsack problem, called Knapsack-of-Knapsacks (K-of-K), is used to model this situation: items have to be packed into small knapsacks, and the small knapsacks have to be packed into a large knapsack. A fully polynomial time approximation scheme is developed to solve K-of-K.
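The nested structure of K-of-K can be made concrete with an exact dynamic-programming sketch for tiny instances (the paper's contribution is an FPTAS; this brute-force toy solver is not it, and the instance data are made up):

```python
def knapsack(capacity, items):
    """Classic 0/1 knapsack by DP over capacities; items are (weight, profit)."""
    best = [0] * (capacity + 1)
    for w, p in items:
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + p)
    return best[capacity]

def k_of_k(big_capacity, small_knapsacks):
    """Knapsack-of-Knapsacks: pack items into small knapsacks, then pack the
    chosen small knapsacks (their sizes acting as weights) into the large one."""
    # Inner level: the best profit each small knapsack can carry on its own.
    packed = [(size, knapsack(size, items)) for size, items in small_knapsacks]
    # Outer level: select small knapsacks subject to the large capacity.
    return knapsack(big_capacity, packed)

subcontractors = [
    (5, [(3, 4), (2, 3), (4, 5)]),   # (capacity, [(resource need, profit), ...])
    (4, [(4, 6), (1, 1), (2, 2)]),
]
print(k_of_k(5, subcontractors))     # only the first subcontractor fits: profit 7
print(k_of_k(9, subcontractors))     # both fit: profit 7 + 6 = 13
```

In the business reading, the large knapsack is the contractor's budget, each small knapsack is a subcontractor's resource capacity, and item profits are the contractor's margins on the outsourced production.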