11.
Transductive methods are useful in prediction problems when the training dataset is composed of a large number of unlabeled observations and a smaller number of labeled observations. In this paper, we propose an approach for developing transductive prediction procedures that are able to take advantage of sparsity in high-dimensional linear regression. More precisely, we define transductive versions of the LASSO (Tibshirani, 1996) and the Dantzig Selector (Candès and Tao, 2007). These procedures combine labeled and unlabeled observations of the training dataset to produce a prediction for the unlabeled observations. An experimental study shows that the transductive estimators improve on the LASSO and Dantzig Selector in many situations, particularly in high-dimensional problems with correlated predictors. We then provide non-asymptotic theoretical guarantees for these estimation methods. Interestingly, our theoretical results show that the Transductive LASSO and Dantzig Selector satisfy sparsity inequalities under weaker assumptions than those required for the "original" LASSO.
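The transductive constructions themselves are the paper's contribution and are not reproduced here; the following is only a minimal sketch of the semi-supervised setting the abstract describes, with a plain LASSO baseline fitted on the labeled portion. All dimensions, variable names, and the regularization level are illustrative assumptions.

```python
# Sketch of the setting: few labeled observations, many unlabeled ones,
# sparse high-dimensional regression. A plain LASSO fitted on the labeled
# data is the baseline the transductive variants aim to improve on.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_lab, n_unlab, p = 50, 200, 500      # p >> n: high-dimensional regime
beta = np.zeros(p)
beta[:5] = 2.0                        # sparse true coefficient vector

X_lab = rng.normal(size=(n_lab, p))
y_lab = X_lab @ beta + rng.normal(scale=0.5, size=n_lab)
X_unlab = rng.normal(size=(n_unlab, p))   # covariates without responses

lasso = Lasso(alpha=0.1).fit(X_lab, y_lab)
y_hat_unlab = lasso.predict(X_unlab)  # predictions the transductive
                                      # LASSO/Dantzig Selector refine
```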
12.
In this paper, based on a progressively type-II censored sample from the generalized Rayleigh (GR) distribution, we consider the problem of estimating the model parameters and predicting the unobserved removed data. Maximum likelihood and Bayesian approaches are used to estimate the scale and shape parameters. The Gibbs and Metropolis samplers are used to predict the life lengths of the units removed at the successive stages of the progressively censored sample. Artificial and real data analyses are performed for illustrative purposes.
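For orientation, the sketch below fits the two-parameter generalized Rayleigh (Burr type X) density f(x; α, λ) = 2αλ²x·e^(−(λx)²)·(1 − e^(−(λx)²))^(α−1) by maximum likelihood on a complete (uncensored) simulated sample. The paper's progressive type-II censoring scheme and Gibbs/Metropolis prediction steps are not reproduced, and all numerical values are illustrative assumptions.

```python
# Hedged sketch: ML fit of the generalized Rayleigh distribution on a
# complete sample, via numerical minimization of the negative log-likelihood.
import numpy as np
from scipy.optimize import minimize

def gr_neg_loglik(params, x):
    alpha, lam = params
    if alpha <= 0 or lam <= 0:
        return np.inf
    z = (lam * x) ** 2
    # log f(x) = log(2*alpha*lam^2*x) - z + (alpha - 1)*log(1 - e^{-z})
    return -np.sum(np.log(2 * alpha * lam**2 * x) - z
                   + (alpha - 1) * np.log1p(-np.exp(-z)))

rng = np.random.default_rng(1)
# Simulate GR(alpha=2, lambda=1) by inversion: F(x) = (1 - e^{-(lam x)^2})^alpha
u = rng.uniform(size=300)
x = np.sqrt(-np.log(1 - u ** (1 / 2.0)))   # lambda = 1, alpha = 2

fit = minimize(gr_neg_loglik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
alpha_hat, lam_hat = fit.x                 # point estimates of shape, scale
```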
13.
This paper explores the utility of different approaches for modeling longitudinal count data with dropouts, arising from a clinical study for the treatment of actinic keratosis lesions on the face and balding scalp. A feature of these data is that, as the disease improves for subjects on the active arm, their data show larger dispersion than those on the vehicle arm, exhibiting over‐dispersion relative to the Poisson distribution. After fitting the marginal (or population-averaged) model using generalized estimating equations (GEE), we note that inferences from such a model might be biased because dropouts are treatment related. We then consider a weighted GEE (WGEE), where each subject's contribution to the analysis is weighted inversely by the subject's probability of dropout. Based on the model findings, we argue that the WGEE might not address concerns about the impact of dropouts on the efficacy findings when dropouts are treatment related. As an alternative, we consider likelihood‐based inference, where random effects are added to the model to allow for heterogeneity across subjects. Finally, we consider a transition model in which, unlike the previous approaches that model the log‐link function of the mean response, we model the subject's actual lesion counts. This model is an extension of the Poisson autoregressive model of order 1, where the autoregressive parameter is taken to be a function of treatment as well as other covariates, inducing different dispersions and correlations for the two treatment arms. We conclude with a discussion of model selection. Published in 2009 by John Wiley & Sons, Ltd.
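As a rough illustration of the first modeling step, the sketch below fits a marginal Poisson GEE with an exchangeable working correlation using statsmodels. The simulated data, covariate names, and effect sizes are illustrative assumptions (no dropout mechanism is simulated); a WGEE step would pass inverse dropout-probability weights through the model's `weights` argument.

```python
# Hedged sketch: marginal (population-averaged) Poisson GEE for
# longitudinal counts, with subjects as clusters.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n_subj, n_visits = 100, 4
df = pd.DataFrame({
    "id": np.repeat(np.arange(n_subj), n_visits),
    "visit": np.tile(np.arange(n_visits), n_subj),
    "trt": np.repeat(rng.integers(0, 2, n_subj), n_visits),
})
lam = np.exp(1.0 - 0.3 * df["trt"] - 0.1 * df["visit"])  # toy mean model
df["count"] = rng.poisson(lam)                           # lesion counts

X = sm.add_constant(df[["trt", "visit"]])
gee = sm.GEE(df["count"], X, groups=df["id"],
             family=sm.families.Poisson(),
             cov_struct=sm.cov_struct.Exchangeable())
res = gee.fit()
print(res.summary())   # robust (sandwich) standard errors by default
```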
14.
Conformal predictors, introduced by Vovk et al. (Algorithmic Learning in a Random World, Springer, New York, 2005), serve to build prediction intervals by exploiting a notion of conformity of the new data point with previously observed data. We propose a novel method for constructing prediction intervals for the response variable in multivariate linear models. The main emphasis is on sparse linear models, where only a few of the covariates have significant influence on the response variable, even if the total number of covariates is very large. Our approach is based on combining the principle of conformal prediction with the ℓ1-penalized least squares estimator (LASSO). The resulting confidence set depends on a parameter ε > 0 and has a coverage probability larger than or equal to 1 − ε. The numerical experiments reported in the paper show that the length of the confidence set is small. Furthermore, as a by-product of the proposed approach, we provide a data-driven procedure for choosing the LASSO penalty. The selection power of the method is illustrated on simulated and real data.
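The paper develops a full (transductive) conformal construction; the sketch below instead uses split conformal prediction, a simpler variant carrying the same ≥ 1 − ε coverage guarantee, with a LASSO point predictor on simulated data. All sample sizes and the penalty level are illustrative assumptions.

```python
# Hedged sketch: split conformal prediction interval around a LASSO fit.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
n, p, eps = 400, 50, 0.1
beta = np.zeros(p)
beta[:3] = 1.5                         # sparse true model
X = rng.normal(size=(n, p))
y = X @ beta + rng.normal(size=n)

# Split the data: one half to fit, one half to calibrate.
X_fit, y_fit = X[:200], y[:200]
X_cal, y_cal = X[200:], y[200:]
model = Lasso(alpha=0.1).fit(X_fit, y_fit)

# Conformity scores = absolute calibration residuals; take the
# finite-sample-corrected (1 - eps) quantile.
scores = np.abs(y_cal - model.predict(X_cal))
level = np.ceil((len(scores) + 1) * (1 - eps)) / len(scores)
q = np.quantile(scores, level)

x_new = rng.normal(size=(1, p))
pred = model.predict(x_new)[0]
interval = (pred - q, pred + q)        # coverage >= 1 - eps
```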
15.
In this paper, we propose a new methodology for solving stochastic inversion problems through computer experiments, the stochasticity being driven by a functional random variable. The study is motivated by an automotive application in which the simulator code takes two sets of inputs: deterministic control variables and functional uncertain variables. This framework has two distinguishing features: the high computational cost of simulations, and the fact that the probability distribution of the functional input is known only through a finite set of realizations. In our context, the inversion problem is formulated by taking the expectation over the functional random variable. We solve this problem by evaluating the model on a design whose adaptive construction combines the so-called stepwise uncertainty reduction methodology with a strategy for efficient expectation estimation. Two greedy strategies are introduced to sequentially estimate the expectation over the functional uncertain variable by adaptively selecting curves from the initial set of realizations. Both strategies use functional principal component analysis (PCA) as a dimensionality-reduction technique, assuming that the realizations of the functional input are independent realizations of the same continuous stochastic process. The first strategy is based on a greedy approach to functional data-driven quantization, while the second is linked to the notion of space-filling design. For each point of the design built in the reduced space, we select the corresponding curve from the sample of available curves, thus guaranteeing the robustness of the procedure to dimension reduction. The whole methodology is illustrated and calibrated on an analytical example; it is then applied to the automotive industrial test case, where the aim is to identify the set of control parameters that meet the pollutant emission standards of a vehicle.
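The full stepwise-uncertainty-reduction loop is beyond a short sketch, but two of its ingredients can be illustrated: functional PCA of a finite curve sample via an SVD of the discretized values, followed by a greedy maximin (space-filling) selection of curves in the reduced score space. The toy curves and all tuning values are illustrative assumptions, not the paper's quantization or design criteria.

```python
# Hedged sketch: functional PCA by SVD, then greedy maximin curve selection.
import numpy as np

rng = np.random.default_rng(4)
n_curves, n_grid = 60, 100
t = np.linspace(0, 1, n_grid)
# Toy random curves standing in for the functional uncertain variable.
curves = (np.sin(2 * np.pi * np.outer(rng.uniform(0.5, 2, n_curves), t))
          + 0.1 * rng.normal(size=(n_curves, n_grid)))

# Functional PCA: center the sample, SVD, keep the two leading scores.
centered = curves - curves.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt[:2].T          # 2-D representation of each curve

# Greedy maximin selection of k curves in score space: each step adds
# the curve farthest from the already-selected set.
k, selected = 8, [0]
for _ in range(k - 1):
    d = np.min(np.linalg.norm(scores[:, None] - scores[selected], axis=2),
               axis=1)
    selected.append(int(np.argmax(d)))

chosen_curves = curves[selected]      # actual curves, not reconstructions,
                                      # keeping the step robust to the PCA
```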
16.
This paper discusses the development of public relations in a fast-growing emerging country, the United Arab Emirates. The making of the public relations profession in the UAE has been shaped profoundly by the socio-economic, educational and cultural development of the country. Ministries and government administrations established in-house public relations departments and sections to respond to the growing demands of their various publics. Journalism and mass communication departments launched PR programs to meet the job market's growing need for qualified practitioners. International public relations agencies chose Dubai as a base for their activities in the UAE and the region. The future of public relations in the UAE is very promising: although the profession faces some problems, public relations is the job of the future.
17.
The Condorcet-Kemeny-Young statistical approach to vote aggregation is based on the assumption that voters have the same probability of correctly comparing two alternatives, and that this probability is the same for any pair of alternatives. We relax the second part of this assumption by letting the probability of correctly comparing two alternatives increase with the distance between them in the allegedly true ranking. This leads to a rule in which the majority in favor of one alternative over another is given a larger weight the larger the distance between the two alternatives in the true ranking, i.e., the larger the probability that voters compare them correctly. This rule is not Condorcet consistent and does not satisfy local independence of irrelevant alternatives. Yet it is anonymous, neutral, and Paretian. It also appears that its performance in selecting the alternative most likely to be the best improves with the rate at which the probability increases. We would like to thank Michel Le Breton for his encouragement to examine this question and for his comments, as well as Philippe De Donder, Jean-Yves Duclos, Stephen Gordon, Cyril Téjédo and an anonymous referee for their comments.
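A compact sketch of the rule described above: each pairwise majority is weighted by the distance between the two alternatives in the candidate ranking, and the ranking maximizing the weighted score is selected. The majority margins and the linear weight function below are illustrative assumptions, not the paper's calibrated probabilities.

```python
# Hedged sketch: distance-weighted Kemeny-style aggregation over all
# rankings of a small set of alternatives.
from itertools import permutations
import numpy as np

# Pairwise majority margins: margins[a, b] = number of voters preferring
# alternative a to alternative b (toy profile of 10 voters).
margins = np.array([[0, 7, 5],
                    [3, 0, 6],
                    [5, 4, 0]])

def w(d):
    # Weight increasing with rank distance d >= 1 (illustrative choice).
    return d

def score(ranking):
    # Sum the weighted majorities consistent with `ranking` (best first).
    pos = {a: i for i, a in enumerate(ranking)}
    return sum(w(pos[b] - pos[a]) * margins[a, b]
               for a in ranking for b in ranking if pos[a] < pos[b])

best = max(permutations(range(3)), key=score)
print("selected ranking:", best)   # the winner is best[0]
```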
18.
Recent promotion of city centre living within UK policy has led to a commensurate interest in city centre conditions and in the opinions and experiences of the people who live there. An apposite, straightforward method for capturing city centre residents' experiences and views is described in this article. We successfully combined a novel, under‐utilized visual technique (self-directed photography) with qualitative methods (log-sheets and interviews) in the form of a 'photo-survey'. A background to visual methodologies is presented, alongside a critique of using the photo-survey with 84 city centre residents to investigate environmental conditions and perceptions within three of the UK's major cities. The method provided a rich, detailed set of data and brought a number of noticeable benefits to the data collection process. The photo-survey not only effectively captured and documented life in the city but also acted as an 'agent for change', evoking thoughts and feelings that ultimately encouraged participants to reflect on their existing perceptions and urban experiences. The study also raises some important considerations for future work using this method and photographs as data, and proposes techniques for minimising potential problems.
19.
International trade in Syria is highly regulated through a combination of tariffs and non-tariff barriers (NTBs). At 8% of the value of imports on average, effective tariffs are relatively low; however, NTBs make Syria's trade restrictiveness very high. Comparing world and domestic prices of imports suggests that NTBs increase the domestic price of imported goods by 17% on average, notably as a result of significant quantitative restrictions. Using a computable general equilibrium model, we assess the costs of NTBs to the Syrian economy. Simulations suggest that the reallocation gains from a complete removal of NTBs could be substantial. Accordingly, the key message of the analysis is that trade reform, if it focuses only on tariff reduction, will have limited growth benefits. By contrast, if the Government abolishes the widespread non-tariff barriers, including quantitative trade restrictions, trade policy can become the central instrument for redressing Syria's growth prospects.