803 search results in total.
1.
Modeling spatial overdispersion requires point process models whose finite-dimensional distributions are overdispersed relative to the Poisson distribution. Fitting such models usually relies heavily on the properties of stationarity, ergodicity, and orderliness. Although processes based on negative binomial finite-dimensional distributions have been widely considered, they typically fail to satisfy all three of these properties simultaneously. Indeed, Diggle and Milne conjectured that no negative binomial model can satisfy all three. In light of this, we change perspective and construct a new process based on a different overdispersed count model, namely the generalized Waring (GW) distribution. While comparable to negative binomial processes in tractability and flexibility, the GW process is shown to possess all of the required properties and, in addition, to span the negative binomial and Poisson processes as limiting cases. In this sense, the GW process provides an approximate resolution to the conundrum highlighted by Diggle and Milne.
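As a concrete illustration of the overdispersion discussed above, the generalized Waring distribution admits a hierarchical (Poisson-gamma-beta-prime) representation, the classical "accident model" construction. The sketch below samples from that hierarchy; the parameterization `(a, k, rho)` follows one common convention and the numeric values are illustrative, not taken from the paper.

```python
import numpy as np
from scipy import stats

def sample_gw(a, k, rho, size, rng):
    """Draw generalized Waring counts via the Poisson-gamma-beta-prime hierarchy:
    nu ~ BetaPrime(a, rho) (mixing 'liability'), lambda | nu ~ Gamma(k, scale=nu),
    X | lambda ~ Poisson(lambda)."""
    nu = stats.betaprime.rvs(a, rho, size=size, random_state=rng)
    lam = rng.gamma(shape=k, scale=nu)
    return rng.poisson(lam)

rng = np.random.default_rng(42)
x = sample_gw(a=2.0, k=2.0, rho=4.0, size=200_000, rng=rng)
# the sample mean is close to a*k/(rho - 1) = 4/3, and the variance
# is well above the mean, i.e. the counts are overdispersed
print(x.mean(), x.var())
```

Sending `rho` to infinity while rescaling recovers the negative binomial, and degenerating the mixing distributions recovers the Poisson, which is the limiting behaviour the abstract refers to.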
2.
3.
We investigate the problem of estimating geodesic tortuosity and constrictivity, two structural characteristics of stationary random closed sets. They are of central importance for the analysis of effective transport properties in porous or composite materials. Loosely speaking, geodesic tortuosity measures the windedness of paths, whereas constrictivity captures the appearance of bottlenecks caused by narrow passages within a given material phase. We first provide mathematically precise definitions of these quantities and introduce appropriate estimators. We then show strong consistency of the estimators for unboundedly growing sampling windows. To apply the estimators to real data sets, the extent of edge effects needs to be controlled; this is illustrated using a model for a multiphase material that is incorporated in solid oxide fuel cells.
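The idea of geodesic tortuosity can be illustrated on a small voxelized two-phase image. The sketch below is a simplification of such an estimator, not the paper's exact definition: it runs a multi-source BFS on the 4-neighbourhood from the left face of a binary grid and averages, over reachable right-face pixels, the ratio of geodesic path length to the straight-line width.

```python
import numpy as np
from collections import deque

def geodesic_tortuosity(phase):
    """Mean geodesic tortuosity of a 2D binary phase for left-to-right transport.
    Path length is counted in 4-neighbour steps; the direct distance is ncols - 1.
    Edge effects and diagonal steps are ignored in this sketch."""
    nrows, ncols = phase.shape
    dist = np.full(phase.shape, np.inf)
    q = deque()
    for r in range(nrows):                  # all open pixels on the left face are sources
        if phase[r, 0]:
            dist[r, 0] = 0.0
            q.append((r, 0))
    while q:                                # multi-source BFS through the open phase
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < nrows and 0 <= cc < ncols and phase[rr, cc] and np.isinf(dist[rr, cc]):
                dist[rr, cc] = dist[r, c] + 1.0
                q.append((rr, cc))
    reached = dist[:, -1][np.isfinite(dist[:, -1])]   # geodesic lengths at the right face
    return float(np.mean(reached / (ncols - 1))) if reached.size else np.inf

open_grid = np.ones((4, 5), dtype=bool)
blocked = np.ones((3, 5), dtype=bool)
blocked[0:2, 2] = False                     # a wall with a single gap in the bottom row
# straight paths give 1.0; the forced detour raises the mean ratio to 1.25
print(geodesic_tortuosity(open_grid), geodesic_tortuosity(blocked))
```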
4.
We propose testing procedures for the hypothesis that a given set of discrete observations can be formulated as a particular time series of counts with a specific conditional law. The new test statistics incorporate the empirical probability-generating function computed from the observations. Special emphasis is given to the popular models of integer autoregression and Poisson autoregression. The asymptotic properties of the proposed test statistics are studied under the null hypothesis as well as under alternatives. A Monte Carlo power study of bootstrap versions of the new methods is included, along with real-data examples.
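To make the probability-generating-function idea concrete, the sketch below builds a Cramér-von-Mises-type distance between the empirical pgf and the pgf of a fitted i.i.d. Poisson null. This is only the i.i.d. version of the device; the paper's statistics target conditional laws of count time series, and its weighting scheme may differ.

```python
import numpy as np

def pgf_test_statistic(x, n_grid=201):
    """n * integral over [0, 1] of (empirical pgf - fitted Poisson pgf)^2 (a sketch)."""
    x = np.asarray(x)
    u = np.linspace(0.0, 1.0, n_grid)
    g_emp = np.mean(u[:, None] ** x[None, :], axis=1)   # (1/n) * sum_i u^{X_i}
    g_null = np.exp(x.mean() * (u - 1.0))               # Poisson pgf, lambda = sample mean
    du = u[1] - u[0]
    return len(x) * np.sum((g_emp - g_null) ** 2) * du  # rectangle-rule integral

rng = np.random.default_rng(1)
t_pois = pgf_test_statistic(rng.poisson(2.0, 500))              # data from the null
t_nb = pgf_test_statistic(rng.negative_binomial(2, 0.4, 500))   # overdispersed alternative
print(t_pois, t_nb)   # the overdispersed sample yields a much larger distance
```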
5.
This article proposes several estimators of the ridge parameter k in the Poisson ridge regression (RR) model. The estimators are evaluated by means of Monte Carlo simulations. As performance criteria, we calculate the mean squared error (MSE), the mean value, and the standard deviation of k. The first criterion is commonly used, while the other two have not previously been used in analyses of Poisson RR. These criteria are nevertheless very informative: if several estimators have an equal estimated MSE, those with a low average value and standard deviation of k should be preferred. Based on the simulation results, we recommend some biasing parameters that may be useful to practitioners in the health, social, and physical sciences.
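The evaluation scheme described above can be sketched as follows. The code fits a Poisson regression by IRLS, plugs the MLE into one illustrative Hoerl-Kennard-style ridge-parameter estimator, k_hat = p / (beta' beta), and records its mean and standard deviation across replications. The estimator and all parameter values are stand-ins, not those proposed in the article.

```python
import numpy as np

def poisson_irls(X, y, k=0.0, n_iter=50):
    """Poisson regression by iteratively reweighted least squares,
    with an optional ridge penalty k on the normal equations (sketch)."""
    p = X.shape[1]
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu                       # working response
        A = X.T @ (mu[:, None] * X) + k * np.eye(p)   # weighted normal equations
        beta = np.linalg.solve(A, X.T @ (mu * z))
    return beta

rng = np.random.default_rng(7)
n, p = 200, 3
true_beta = np.array([0.5, 0.3, -0.2])
k_hats = []
for _ in range(200):                                  # Monte Carlo replications
    X = 0.5 * rng.normal(size=(n, p))
    y = rng.poisson(np.exp(X @ true_beta))
    b_mle = poisson_irls(X, y)
    k_hats.append(p / (b_mle @ b_mle))                # illustrative k estimator
k_hats = np.array(k_hats)
print(k_hats.mean(), k_hats.std())                    # the two extra criteria from the abstract
```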
6.
Abstract.  Recurrent event data are largely characterized by the rate function, but smoothing techniques for estimating the rate function have never been rigorously developed or studied in the statistical literature. This paper considers the moment and least squares methods for estimating the rate function from recurrent event data. Under an independent censoring assumption on the recurrent event process, we study statistical properties of the proposed estimators and propose bootstrap procedures for bandwidth selection and for approximating confidence intervals in the estimation of the occurrence rate function. We show that, without resmoothing via a smaller bandwidth, the moment method produces a curve with nicks at the censoring times, whereas the least squares method has no such problem. Furthermore, the asymptotic variance of the least squares estimator is shown to be smaller under regularity conditions. In implementing the bootstrap procedures, however, the moment method is computationally more efficient than the least squares method because it uses condensed bootstrap data. The performance of the proposed procedures is studied through Monte Carlo simulations and an epidemiological example on intravenous drug users.
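A minimal sketch of a moment-type kernel estimator of the occurrence rate: each observed event time is given kernel weight divided by Y(t), the number of subjects still under observation at that time, which is the usual at-risk correction under independent censoring. The data-generating setup and bandwidth below are illustrative; the paper's estimators, resmoothing step, and bootstrap procedures are not reproduced.

```python
import numpy as np

def moment_rate_estimator(event_times, censor_times, t_grid, h):
    """Kernel (moment-type) rate estimate: sum over events of K_h(t - t_ij) / Y(t_ij),
    with Y(t) the at-risk count (a sketch of the general idea)."""
    censor = np.asarray(censor_times, dtype=float)
    K = lambda u: np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)  # Epanechnikov
    rate = np.zeros_like(t_grid, dtype=float)
    for times in event_times:                      # one array of event times per subject
        for t_ij in times:
            y_at_risk = np.sum(censor >= t_ij)     # subjects still under observation
            if y_at_risk > 0:
                rate += K((t_grid - t_ij) / h) / (h * y_at_risk)
    return rate

rng = np.random.default_rng(3)
n = 100
censor_times = rng.uniform(2.0, 5.0, size=n)       # per-subject end of follow-up
# homogeneous Poisson process with rate 1.5 on [0, censor_i] for each subject
event_times = [np.sort(rng.uniform(0.0, c, rng.poisson(1.5 * c))) for c in censor_times]
t_grid = np.linspace(0.5, 1.5, 11)                 # interior grid, away from boundaries
rates = moment_rate_estimator(event_times, censor_times, t_grid, h=0.4)
print(rates)   # hovers around the true rate 1.5
```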
7.
The generalized method of moments (GMM) is used to develop tests for discriminating among discrete distributions in the two-parameter Katz family. Relationships involving moments are exploited to obtain identifying and over-identifying restrictions. The asymptotic relative efficiencies of the GMM-based tests are analyzed using the local power approach and approximate Bahadur efficiency. The paper also reports Monte Carlo experiments designed to check the validity of the theoretical findings and to shed light on the small-sample properties of the proposed tests. Extensions of the results to compound Poisson alternative hypotheses are discussed.
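The Katz family satisfies the recursion p_{x+1}/p_x = (alpha + beta*x)/(x + 1), which gives mean alpha/(1 - beta) and variance alpha/(1 - beta)^2; beta = 0 is the Poisson, beta > 0 the negative binomial, and beta < 0 the binomial case. The sketch below uses the resulting moment relation beta = 1 - mean/variance, the simplest moment restriction of the kind the paper builds GMM tests on (the paper's full over-identified GMM machinery is not reproduced).

```python
import numpy as np

def katz_moment_estimates(x):
    """Method-of-moments estimates (alpha, beta) for the Katz family,
    using mean = alpha/(1-beta) and variance = alpha/(1-beta)^2."""
    m, v = x.mean(), x.var()
    beta = 1.0 - m / v            # beta = 0 corresponds to the Poisson member
    alpha = m * (1.0 - beta)
    return alpha, beta

rng = np.random.default_rng(5)
_, b_pois = katz_moment_estimates(rng.poisson(3.0, 5000))
_, b_nb = katz_moment_estimates(rng.negative_binomial(5, 0.5, 5000))
print(b_pois, b_nb)   # near 0 for Poisson data; near 0.5 for this NB sample
```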
8.
The impact of complex emergencies and relief interventions has often been evaluated by comparing absolute mortality with internationally standardized mortality rates. A better evaluation would compare mortality with the local baseline mortality of the affected population. A projection of population-based survival data into the period of the emergency or intervention, based on information from before the emergency, can provide such a local baseline reference. We find that a log-transformed Gaussian time series model, in which the standard errors of the estimated rates are included in the variance, has the best forecasting capacity. If time-at-risk during the forecast period is known, however, forecasting may instead be done with a Poisson time series model with overdispersion. In either case, the standard error of the estimated rates must enter the variance of the model, either additively in the Gaussian model or multiplicatively, through overdispersion, in the Poisson model. The data on which the forecast is based must be modelled carefully, accounting not only for calendar-time trends but also for periods with an excessive frequency of events (epidemics) and for seasonal variation, in order to eliminate residual autocorrelation and to produce a proper reference for comparison that reflects changes over time during the emergency. Hence, with careful modelling, it is possible to predict a reference for an emergency-affected population based on local conditions. We predicted childhood mortality during the 1998-1999 war in Guinea-Bissau and found increased mortality in the first half-year of the war and mortality close to the expected level in the last half-year.
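The overdispersed-Poisson idea can be sketched in its simplest form: estimate a baseline rate from pre-emergency counts, estimate an overdispersion factor from the Pearson statistic, and widen the forecast interval by that factor. The counts below are simulated negative binomial data (illustrative, not the Guinea-Bissau data), and the model is intercept-only, so the trend, seasonal, and epidemic adjustments the paper stresses are deliberately omitted.

```python
import numpy as np

rng = np.random.default_rng(11)
n_months = 120
counts = rng.negative_binomial(20, 0.5, n_months)   # baseline counts: mean 20, variance 40

mu_hat = counts.mean()                              # intercept-only Poisson fit
phi = np.sum((counts - mu_hat) ** 2 / mu_hat) / (n_months - 1)  # Pearson overdispersion
se_pred = np.sqrt(phi * mu_hat)                     # predictive sd widened by phi
lo, hi = mu_hat - 1.96 * se_pred, mu_hat + 1.96 * se_pred
# baseline reference interval against which emergency-period counts would be compared
print(round(mu_hat, 1), round(phi, 2), round(lo, 1), round(hi, 1))
```

With phi near 1 this collapses to the ordinary Poisson interval; here phi is around 2, reflecting the extra-Poisson variation the abstract argues must be carried into the forecast variance.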
9.
Summary: In this paper we show that the bivariate mixed Poisson process arises in a natural way from the univariate mixed Poisson process, which is used in several areas for counting certain events, and we state some properties of the bivariate process. In the second part of the paper we illustrate how, by means of the bivariate mixed Poisson process, a bonus-malus system that distinguishes between types of claim can be derived from the classical bonus-malus system in third-party liability insurance. To this end we first check the model against the given data, then estimate the distribution parameters and compute net premiums for different mixing distributions, and finally test the accuracy of the predictions.
* Talk given at the Dresdner Forum zur Versicherungsmathematik: Tarifierung in Erst- und Rückversicherung, 25 June 2004. The author warmly thanks Lothar Partzsch, Klaus D. Schmidt (both Dresden), and Friedemann Spies (Munich) for their support of this work.
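A minimal sketch of how a shared mixing variable yields claim-type-specific net premiums, assuming a gamma mixing distribution: claim counts of two types are conditionally Poisson given a common risk level L, N1 | L ~ Poisson(L*w1) and N2 | L ~ Poisson(L*w2), and the gamma prior is conjugate, so the posterior mean of L updates both premiums from the joint claims history. All parameter names and values are illustrative; the paper compares several mixing distributions, not only the gamma.

```python
# Bonus-malus sketch: shared Gamma(a, b) mixing variable L (b is a rate parameter),
# claim-type weights w1, w2, totals s1, s2 observed over t years.
def posterior_premiums(a, b, w1, w2, s1, s2, t):
    """Net premiums (expected counts next year, per type) from the conjugate
    posterior L | history ~ Gamma(a + s1 + s2, b + t*(w1 + w2))."""
    post_mean_L = (a + s1 + s2) / (b + t * (w1 + w2))
    return post_mean_L * w1, post_mean_L * w2

# claim-free driver vs a driver with claims, mostly of the second type
p_clean = posterior_premiums(a=2.0, b=2.0, w1=0.05, w2=0.10, s1=0, s2=0, t=5)
p_loaded = posterior_premiums(a=2.0, b=2.0, w1=0.05, w2=0.10, s1=1, s2=3, t=5)
print(p_clean, p_loaded)   # both type-specific premiums rise after claims
```

Because L is shared, claims of one type raise the premium for both types; a richer bivariate mixing distribution would let the two types load premiums differently.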
10.
This note exhibits two independent integer-valued random variables, X1 and X2, such that neither X1 nor X2 has a generalized Poisson distribution, but X1 + X2 does. This contradicts statements made by Professor Consul in his recent book.
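For contrast with the note's counterexample, the familiar direction of closure does hold: for a common second parameter lambda, the convolution of GP(theta1, lambda) and GP(theta2, lambda) is GP(theta1 + theta2, lambda). The sketch below checks this numerically from Consul's pmf, P(X = x) = theta*(theta + x*lambda)^(x-1) * exp(-(theta + x*lambda)) / x!. The note shows the converse fails: a GP-distributed sum does not force GP-distributed summands.

```python
import numpy as np
from scipy.special import gammaln

def gp_pmf(theta, lam, x_max):
    """Generalized Poisson pmf (Consul) on 0..x_max, computed on the log scale."""
    x = np.arange(x_max + 1)
    logp = (np.log(theta) + (x - 1) * np.log(theta + x * lam)
            - (theta + x * lam) - gammaln(x + 1))
    return np.exp(logp)

lam, x_max = 0.3, 80
p1 = gp_pmf(1.0, lam, x_max)
p2 = gp_pmf(2.0, lam, x_max)
conv = np.convolve(p1, p2)[: x_max + 1]        # exact convolution for indices <= x_max
err = np.max(np.abs(conv[:60] - gp_pmf(3.0, lam, x_max)[:60]))
print(err)   # numerically zero: the convolution matches GP(3.0, 0.3)
```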