Similar Documents
20 similar documents found (search time: 15 ms).
1.
Major-country diplomacy with Chinese characteristics is an exploratory new direction in China's diplomatic thinking in recent years. This paper distills the official rhetoric of China's Ministry of Foreign Affairs to construct a time-varying graded index of the diplomatic closeness between China and each of its trading partners, and validates the index along two dimensions: how the diplomatic ties were established and how they have been maintained. Based on a trade gravity model that incorporates a diplomacy factor, the paper proposes a "diplomacy-trade" driving hypothesis and then provides systematic empirical evidence for it using bilateral trade data between China and 166 countries over 1992-2013. The results show that each one-grade improvement in diplomatic relations significantly raises China's openness to the partner country's goods by 23%; the hypothesis continues to hold after accounting for the endogeneity of diplomacy, shifts in the model's time points, and alternative indices, confirming the robustness of the conclusions. Further analysis finds that the diplomatic driving effect not only follows an inverted-U nonlinear transmission pattern but also exhibits a significant structural break between "strong" and "weak" countries: diplomacy toward weak countries centers on economic development, whereas diplomacy toward strong countries emphasizes a transition toward broader, pan-economic concerns. These results imply that, against a backdrop of generally improving international relations, China's diplomacy is shifting from purely economic toward pan-economic cooperation, in keeping with its major-country posture.
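A minimal sketch of the kind of diplomacy-augmented gravity regression the abstract describes, assuming synthetic data and illustrative variable names (dipl_grade, log_imports, etc.) rather than the paper's actual 1992-2013 panel:

```python
# Hedged sketch of a gravity equation with a diplomacy term; synthetic placeholder data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "dipl_grade": rng.integers(1, 6, n),          # time-varying diplomatic grade (illustrative)
    "log_gdp_partner": rng.normal(10, 1, n),
    "log_distance": rng.normal(8, 0.5, n),
    "year": rng.integers(1992, 2014, n),
})
df["log_imports"] = (0.2 * df.dipl_grade + 1.0 * df.log_gdp_partner
                     - 0.8 * df.log_distance + rng.normal(0, 1, n))

# Gravity specification with the diplomacy index and year fixed effects.
fit = smf.ols("log_imports ~ dipl_grade + log_gdp_partner + log_distance + C(year)",
              data=df).fit()
print(fit.params["dipl_grade"])   # estimated trade response to a one-grade improvement
```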

2.
Dead recoveries of marked animals are commonly used to estimate survival probabilities. Band-recovery models can be parameterized either by r (the probability of recovering a band conditional on death of the animal) or by f (the probability that an animal will be killed, retrieved, and have its band reported). The r parametrization can be implemented in a capture-recapture framework with two states (alive and newly dead), mortality being the transition probability between the two states. The authors show here that the f parametrization can also be implemented in a multistate framework by imposing simple constraints on some parameters. They illustrate it using data on the mallard and the snow goose. However, they mention that because it does not entirely separate the individual survival and encounter processes, the f parametrization must be used with care in reduced models, or in the presence of estimates at the boundary of the parameter space. As they show, a multistate framework allows the use of powerful software for model fitting or testing the goodness-of-fit of models; it also affords the implementation of complex models such as those based on mixtures of information or uncertain states.

3.
4.
This study investigates the use of stratification to improve discrimination when prior probabilities vary across strata of a population of interest. Sources of heterogeneity in prior probabilities include differences in geographic locale, age differences in the population studied, or differences in the time component of the data collected. The article suggests using logistic regression both to identify the underlying stratification and to estimate prior probabilities. A simulation study compares misclassification rates under two alternative stratification schemes with the traditional discriminant approach that ignores stratification in favor of pooled prior estimates. The simulations show that large asymptotic gains can be realized by stratification, and that these gains can be realized in finite samples, given moderate differences in prior probabilities.
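One way to read the proposed procedure is: estimate stratum-specific priors with a logistic regression on stratum indicators, then plug them into the discriminant rule in place of a pooled prior. The sketch below illustrates that idea on synthetic data; it is not the article's code, and the Gaussian LDA step is just one convenient choice.

```python
# Hedged sketch: stratum-specific priors for discriminant analysis (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n = 2000
stratum = rng.integers(0, 3, n)                       # e.g. geographic locale
p_pos = np.array([0.1, 0.3, 0.6])[stratum]             # true prior varies by stratum
y = rng.binomial(1, p_pos)
X = rng.normal(loc=y[:, None] * 1.0, scale=1.0, size=(n, 2))   # features

# Step 1: estimate stratum-specific priors via logistic regression on stratum dummies.
S = np.eye(3)[stratum]
prior_model = LogisticRegression().fit(S, y)
prior_hat = prior_model.predict_proba(S)[:, 1]          # estimated P(y=1 | stratum)

# Step 2: pooled LDA with equal priors gives the class-conditional log-likelihood ratio.
lda = LinearDiscriminantAnalysis(priors=[0.5, 0.5]).fit(X, y)
log_lr = lda.predict_log_proba(X)[:, 1] - lda.predict_log_proba(X)[:, 0]

# Step 3: combine the likelihood ratio with the stratum prior instead of a pooled prior.
log_odds = log_lr + np.log(prior_hat) - np.log1p(-prior_hat)
y_hat = (log_odds > 0).astype(int)
print("misclassification rate:", np.mean(y_hat != y))
```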

5.
Homotheticity induces a dramatic statistical bias in the estimates of the intratemporal and intertemporal substitutions. I find potent support in favor of nonhomotheticity in aggregate consumption data, with nondurable goods being necessities and durable goods luxuries. I find the intertemporal substitutability to be negligible (0.04), a magnitude close to Hall's (1988) original estimate, and the intratemporal substitutability between nondurable goods and the service flow from the stock of durable goods to be small as well (0.18). Despite that, due to the secular decline of the rental cost, the budget share of durable goods appears trendless.

6.
We analyze publicly available data to estimate the causal effects of military interventions on the homicide rates in certain problematic regions in Mexico. We use the Rubin causal model to compare the post-intervention homicide rate in each intervened region to the hypothetical homicide rate for that same year had the military intervention not taken place. Because the effect of a military intervention is not confined to the municipality subject to the intervention, a nonstandard definition of units is necessary to estimate the causal effect of the intervention under the standard no-interference assumption, the stable unit treatment value assumption (SUTVA). Donor pools are created for each missing potential outcome under no intervention, thereby allowing for the estimation of unit-level causal effects. A multiple imputation approach accounts for uncertainty about the missing potential outcomes.

7.
In Part I the context of military planning is examined from the standpoint of uncertainty. There is a special focus on the uncertainty surrounding the subject of C3 (command, control and communications). It is argued that military planning is a fuzzy process. Tools being developed within the OJCS to cope with the subject of tactical C3 are introduced. In Part II case histories of two decision aids, which deal directly with uncertainty from two distinct vantage points, are presented.

8.
This article introduces a fast cross-validation algorithm that performs wavelet shrinkage on data sets of arbitrary size and irregular design and also simultaneously selects good values of the primary resolution and number of vanishing moments. We demonstrate the utility of our method by suggesting alternative estimates of the conditional mean of the well-known Ethanol data set. Our alternative estimates outperform the Kovac-Silverman method with a global variance estimate by 25% because of the careful selection of the number of vanishing moments and primary resolution. Our alternative estimates are simpler than, and competitive with, results based on the Kovac-Silverman algorithm equipped with a local variance estimate. We include a detailed simulation study that illustrates how our cross-validation method successfully picks good values of the primary resolution and number of vanishing moments for unknown functions based on Walsh functions (to test the response to changing primary resolution) and piecewise polynomials with zero or one derivative (to test the response to function smoothness).

9.
In this paper, we obtain balanced resolution V plans for 2^m factorial experiments (4 ≤ m ≤ 8) which have an additional feature. Instead of assuming that the three-factor and higher-order effects are all zero, we assume that there is at most one nonnegligible effect among them; however, we do not know which particular effect is nonnegligible. The problem is to search for which effect is nonnegligible and to estimate it, along with estimating the main effects and two-factor interactions, etc., as in an ordinary resolution V design. For every value of N (the number of treatments) within a certain practical range, we present a design with which the search and estimation can be carried out. (Of course, as in all statistical problems, the probability of correct search will depend upon the size of "error" or "noise" present in the observations. However, the designs obtained are such that, at least in the noiseless case, this probability equals 1.) It is found that many of these designs are identical with optimal balanced resolution V designs obtained earlier in the work of Srivastava and Chopra.

10.
The mixture of Type I and Type II censoring schemes, called hybrid censoring, is quite important in life-testing experiments. Epstein (1954, 1960) introduced this testing scheme and proposed a two-sided confidence interval to estimate the mean lifetime, θ, when the underlying lifetime distribution is assumed to be exponential. Two-sided confidence intervals and credible intervals were also proposed by Fairbanks et al. (1982) and Draper and Guttman (1987), respectively. In this paper we obtain the exact two-sided confidence interval for θ following the approach of Chen and Bhattacharya (1988). We also obtain asymptotic confidence intervals in the hybrid censoring case. It is important to observe that the results for Type I and Type II censoring schemes can be obtained as particular cases of the hybrid censoring scheme. We analyze one data set and compare the different methods by Monte Carlo simulations.
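For the exponential case, the ML point estimate under hybrid censoring has a simple closed form (total time on test divided by the number of observed failures); the intervals discussed above are built around this quantity. A small illustrative simulation, with arbitrary parameter values:

```python
# Hedged sketch: MLE of the exponential mean under hybrid censoring
# (stop at the r-th failure or at time T, whichever comes first); synthetic data.
import numpy as np

rng = np.random.default_rng(1)
theta, n, r, T = 10.0, 25, 15, 12.0           # true mean, sample size, failure cap, time cap

x = np.sort(rng.exponential(theta, n))         # ordered lifetimes
tau = min(x[r - 1], T)                         # hybrid stopping time
d = int(np.sum(x <= tau))                      # number of observed failures
ttt = np.sum(x[x <= tau]) + (n - d) * tau      # total time on test

theta_hat = ttt / d if d > 0 else np.nan       # MLE (undefined if no failures occur)
print(f"failures={d}, stopping time={tau:.2f}, theta_hat={theta_hat:.2f}")
```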

11.
In a clinical trial comparing two treatment groups, one commonly used endpoint is time to death. Another is time until the first nonfatal event (if there is one) or until death (if not). Both endpoints have drawbacks. The wrong choice may adversely affect the value of the study by impairing power if deaths are too few (with the first endpoint) or by lessening the role of mortality if not (with the second endpoint). We propose a compromise that provides a simple test based on the time to death if the patient has died, or on the time since randomization augmented by an increment otherwise. The test applies the ordinary two-sample Wilcoxon statistic to these values. The formula for the increment (the same for experimental and control patients) must be specified before the trial starts. In the simplest (and perhaps most useful) case, the increment assumes only two values, according to whether or not the (surviving) patient had a nonfatal event. More generally, the increment depends on the time of the first nonfatal event, if any, and the time since randomization. The test has correct Type I error even though it does not handle censoring in a customary way. For conditions where investigators would face no easy (advance) choice between the two older tests, simulation results favor the new test. An example using a renal-cancer trial is presented.
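A rough sketch of how the proposed scoring could look in practice, assuming a two-valued increment fixed before the trial; the data and increment below are invented for illustration:

```python
# Hedged sketch: deaths are scored by time to death; survivors get follow-up time plus a
# prespecified increment that depends only on whether a nonfatal event occurred; the
# two-sample Wilcoxon (Mann-Whitney) statistic is then applied to these scores.
from scipy.stats import mannwhitneyu

def augmented_score(event_time, died, had_nonfatal_event, follow_up, increment=5.0):
    """Time to death if dead; otherwise follow-up time, plus the increment only for
    event-free survivors (so event-free > nonfatal event > death in the ranking)."""
    if died:
        return event_time
    return follow_up + (0.0 if had_nonfatal_event else increment)

# toy data: (event time, died?, nonfatal event?, follow-up time) -- illustrative only
control = [(3.1, True, False, 3.1), (8.0, False, True, 12.0), (12.0, False, False, 12.0)]
treated = [(9.5, True, True, 9.5), (12.0, False, False, 12.0), (12.0, False, False, 12.0)]

u = [augmented_score(*p) for p in control]
v = [augmented_score(*p) for p in treated]
print(mannwhitneyu(u, v, alternative="two-sided"))
```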

12.
Acceptance sampling plans offered by ISO 2859-1 are far from optimal under the conditions for statistical verification in modules F and F1 as prescribed by Annex II of the Measuring Instruments Directive (MID) 2014/32/EU, resulting in sample sizes that are larger than necessary. An optimised single-sampling scheme is derived, both for large lots using the binomial distribution and for finite-sized lots using the exact hypergeometric distribution, resulting in smaller sample sizes that are economically more efficient while offering the full statistical protection required by the MID.
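The optimisation described here amounts to searching for the smallest plan (n, c) whose operating characteristic curve satisfies both risk points. A generic sketch for the large-lot (binomial) case, with illustrative risk points rather than the MID's actual requirements (scipy.stats.hypergeom.cdf could replace the binomial for finite lots):

```python
# Hedged sketch: smallest single-sampling plan (n, c) meeting a producer risk at the AQL
# and a consumer risk at the limiting quality, using the binomial model for large lots.
from scipy.stats import binom

def smallest_plan(aql, lq, alpha=0.05, beta=0.05, n_max=2000):
    """Return the first (n, c) with P(accept | aql) >= 1-alpha and P(accept | lq) <= beta."""
    for n in range(1, n_max + 1):
        for c in range(0, n + 1):
            if binom.cdf(c, n, aql) >= 1 - alpha and binom.cdf(c, n, lq) <= beta:
                return n, c
    return None

print(smallest_plan(aql=0.01, lq=0.07))   # illustrative risk points, not MID values
```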

13.
We consider two problems concerning locating change points in a linear regression model. One involves jump discontinuities (change points) in a regression model and the other involves regression lines connected at unknown points. We compare four methods for estimating single or multiple change points in a regression model, when both the error variance and regression coefficients change simultaneously at the unknown point(s): the Bayesian, Julious, grid search, and segmented methods. The proposed methods are evaluated via a simulation study and compared via some standard measures of estimation bias and precision. Finally, the methods are illustrated and compared using three real data sets. The simulation and empirical results overall favor both the segmented and Bayesian methods of estimation, which simultaneously estimate the change point and the other model parameters, though only the Bayesian method is able to handle both continuous and discontinuous change-point problems successfully. If it is known that the regression lines are continuous, then the segmented method ranks first among the methods.
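Of the four methods compared, the grid-search approach is the easiest to sketch: fit separate least-squares lines on each side of every candidate break and keep the split with the smallest total residual sum of squares. A toy example with one discontinuous change point:

```python
# Hedged sketch of a grid search for a single change point; synthetic data with one jump.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 200)
y = np.where(x < 6, 1 + 0.5 * x, 8 - 1.0 * x) + rng.normal(0, 0.3, x.size)

def sse(xs, ys):
    """Residual sum of squares of a simple linear fit."""
    X = np.column_stack([np.ones_like(xs), xs])
    beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
    return np.sum((ys - X @ beta) ** 2)

candidates = x[5:-5]                                  # keep a few points in each segment
total = [sse(x[x <= c], y[x <= c]) + sse(x[x > c], y[x > c]) for c in candidates]
tau_hat = candidates[int(np.argmin(total))]
print("estimated change point:", round(float(tau_hat), 2))
```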

14.
In many of the applied sciences, it is common that the forms of empirical relationships are almost completely unknown prior to study. Scatterplot smoothers used in nonparametric regression methods have considerable potential to ease the burden of model specification that a researcher would otherwise face in this situation. Occasionally the researcher will know the sign of the first or second derivatives, or both. This article develops a smoothing method that can incorporate this kind of information. I show that cubic regression splines with bounds on the coefficients offer a simple and effective approximation to monotonic, convex or concave transformations. I also discuss methods for testing whether the constraints should be imposed. Monte Carlo results indicate that this method, dubbed CoSmo, has a lower approximation error than either locally weighted regression or two other constrained smoothing methods. CoSmo has many potential applications and should be especially useful in applied econometrics. As an illustration, I apply CoSmo in a multivariate context to estimate a hedonic price function and to test for concavity in one of the variables.
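The core idea, a spline basis whose bounded coefficients force the fitted transformation to be monotone, can be illustrated with a simplified linear (ramp) spline basis; the article's cubic version (CoSmo) follows the same principle but is not reproduced here:

```python
# Hedged sketch: bounded-coefficient regression spline for a monotone fit.  With a free
# intercept and nonnegative coefficients on x and the ramp terms, the fit is nondecreasing
# because it is a nonnegative combination of nondecreasing basis functions.
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 1, 150))
y = np.log1p(5 * x) + rng.normal(0, 0.05, x.size)      # monotone truth plus noise

knots = np.linspace(0, 1, 12)[1:-1]
B = np.column_stack([np.ones_like(x), x] + [np.clip(x - k, 0, None) for k in knots])

lower = np.r_[-np.inf, np.zeros(B.shape[1] - 1)]        # intercept free, slopes >= 0
res = lsq_linear(B, y, bounds=(lower, np.inf))
fit = B @ res.x
print("smallest step between fitted values:", float(np.min(np.diff(fit))))  # should be >= 0
```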

15.
Methods of nonparametric inference are proposed for a process with two transient and three absorbing states. It is assumed that the times of transitions between the transient states are unobservable. One area of application is epidemiology, where the transient states correspond to healthy and ill, while the absorbing states correspond to types of death. It is the onset of illness that is not observable. An estimate is given for the cumulative hazard rate between the transient states, the exit hazard rates are estimated at a specific point in time, and a statistic for comparing exit rates from the transient states is given.

16.
The vast majority of reliability analyses assume that components and systems are in either of two states: functioning or failed. However, in many real-life situations we are actually able to distinguish among various 'levels of performance' for both system and components. For such situations, the existing dichotomous model is a gross oversimplification, and so models assuming degradable (multistate) systems and components are preferable since they are closer to reality. We present a survey of recent papers which treat the more sophisticated and more realistic models in which components and systems may assume many states ranging from perfect functioning to complete failure. Our survey updates and complements a previous survey by El-Neweihi and Proschan (1980). Some new results are included.

17.
Using a sample of medical malpractice insurance claims closed between 1 October 1985 and 1 October 1989 in the USA, we estimate the impact of legal reforms on the longevity of disputes, via a competing risks model that accounts for length-biased sampling and a finite sampling horizon. We find that only the 'English rule', which requires the loser at trial to pay all legal expenses, shortens the duration of disputes. Our results for this law also show that failure to correct for length-biased sampling can incorrectly imply that the English rule lengthens the time needed for settlement and litigation. Our estimates also suggest that tort reforms that place additional procedural hurdles in the plaintiffs' paths tend to lengthen the time to disposition. Here, correction for a finite sampling horizon substantially changes the inferences with regard to the effect of this reform on duration.

18.
In practice, different practitioners will use different Phase I samples to estimate the process parameters, which leads to different Phase II control-chart performance. Researchers refer to this variability as the between-practitioners variability of control charts. Since this variability is important in the design of the CUSUM median chart with estimated process parameters, the standard deviation of the average run length (SDARL) is used to study its properties. It is shown that the CUSUM median chart requires a large number of Phase I samples to sufficiently reduce the variation in its in-control ARL. Considering the limited amount of Phase I data usually available, a bootstrap approach is also used to adjust the control limits of the CUSUM median chart. Comparisons are made between the CUSUM and Shewhart median charts with estimated parameters when using the adjusted and unadjusted control limits, and some conclusions are drawn.

19.
Profile monitoring is the use of control charts for cases in which the quality of a process or product can be characterized by a functional relationship between a response variable and one or more explanatory variables. Unlike the linear profile, with its simple structure, the nonlinear profile has received relatively little attention because of its high complexity. Regression models are the usual starting point for Phase I analysis of nonlinear profiles, but they lack sensitivity to local changes. This article presents a strategy comprising two major components: data segmentation, to locate local changes by overlaying grid points on the horizontal axis, and change-point detection via maximum likelihood estimation. A simulated data set from a polynomial profile is used to illustrate the effectiveness of the proposed strategy, which is compared with Williams' T2 multivariate statistic.

20.
Type I and Type II censored data arise frequently in controlled laboratory studies concerning time to a particular event (e.g., death of an animal or failure of a physical device). Log-location-scale distributions (e.g., Weibull, lognormal, and loglogistic) are commonly used to model the resulting data. Maximum likelihood (ML) is generally used to obtain parameter estimates when the data are censored. The Fisher information matrix can be used to obtain large-sample approximate variances and covariances of the ML estimates or to estimate these variances and covariances from data. The derivations of the Fisher information matrix proceed differently for Type I (time censoring) and Type II (failure censoring) because the number of failures is random in Type I censoring, whereas the length of the data collection period is random in Type II censoring. Under regularity conditions (met by the above-mentioned log-location-scale distributions), we outline the different derivations and show that the Fisher information matrices for Type I and Type II censoring are asymptotically equivalent.
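As a concrete illustration of the quantities involved, the sketch below fits a lognormal model to Type I censored data by ML and approximates the observed information with a finite-difference Hessian; it is a generic example, not the authors' derivation:

```python
# Hedged sketch: ML fit of a lognormal model under Type I (time) censoring, with the
# observed information taken as a finite-difference Hessian of the negative log-likelihood;
# its inverse approximates the large-sample covariance discussed above.  Synthetic data.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(4)
t = rng.lognormal(mean=2.0, sigma=0.5, size=100)
tc = 10.0                                        # fixed censoring time (Type I)
obs = t[t <= tc]
cens = np.full(int(np.sum(t > tc)), tc)          # censored units contribute survival terms

def negloglik(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    z_obs = (np.log(obs) - mu) / sigma
    z_cen = (np.log(cens) - mu) / sigma
    ll = np.sum(norm.logpdf(z_obs) - np.log(sigma * obs))   # density terms for failures
    ll += np.sum(norm.logsf(z_cen))                          # survival terms for censored units
    return -ll

fit = minimize(negloglik, x0=[1.0, 0.0], method="Nelder-Mead")

def hessian(f, x, h=1e-4):
    """Central finite-difference Hessian (observed information when f is the negative loglik)."""
    x = np.asarray(x, float)
    k = len(x)
    H = np.empty((k, k))
    for i in range(k):
        for j in range(k):
            e_i, e_j = np.eye(k)[i] * h, np.eye(k)[j] * h
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * h * h)
    return H

info = hessian(negloglik, fit.x)
print("approximate covariance of (mu, log sigma):\n", np.linalg.inv(info))
```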
