Article Search
Access: fee-based full text: 179; free: 3; free (domestic): 1.
By discipline: Management: 76; Demography: 6; Collected works: 7; Theory and methodology: 6; Comprehensive: 32; Sociology: 13; Statistics: 43.
By year: 2024: 1; 2023: 2; 2021: 3; 2020: 3; 2019: 7; 2018: 5; 2017: 6; 2015: 5; 2014: 6; 2013: 21; 2012: 10; 2011: 10; 2010: 6; 2009: 11; 2008: 4; 2007: 5; 2006: 9; 2005: 8; 2004: 8; 2003: 3; 2002: 3; 2001: 9; 2000: 5; 1999: 5; 1998: 4; 1997: 7; 1996: 2; 1995: 3; 1994: 6; 1993: 1; 1991: 1; 1990: 1; 1987: 2; 1984: 1.
183 query results in total (search time: 46 ms); results 51–60 are shown below.
51.
ROC analysis involving two large datasets is an important method for analyzing statistics of interest for classifier decision making in many disciplines. Moreover, data dependency caused by multiple uses of the same subjects is ubiquitous, because limited resources often force subjects to be reused to generate more samples. Hence, a two-layer data structure is constructed, and the nonparametric two-sample two-layer bootstrap is employed to estimate the standard errors of statistics of interest derived from two sets of data, such as a weighted sum of two probabilities. In this article, to reduce the bootstrap variance and ensure computational accuracy, Monte Carlo studies of bootstrap variability were carried out to determine the appropriate number of bootstrap replications in ROC analysis with data dependency. The results suggest that, with a tolerance of 0.02 for the coefficient of variation, 2,000 bootstrap replications are appropriate under such circumstances.
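A minimal sketch of the two-layer resampling idea (the data layout, the toy statistic, and the CV check are illustrative assumptions, not the authors' code): subjects are resampled with replacement first, then scores within each drawn subject, and the stability of the resulting standard error is checked through the coefficient of variation across repeated bootstrap runs.

```python
import numpy as np

rng = np.random.default_rng(0)

def two_layer_bootstrap_se(groups_a, groups_b, stat, n_boot=2000):
    """Two-sample, two-layer nonparametric bootstrap SE of stat(a, b).

    groups_a / groups_b: lists of 1-D arrays, one array per subject.
    Subjects are resampled first, then scores within each drawn subject,
    which respects the dependency among scores from the same subject.
    """
    reps = np.empty(n_boot)
    for i in range(n_boot):
        resampled = []
        for groups in (groups_a, groups_b):
            idx = rng.integers(0, len(groups), size=len(groups))
            resampled.append(np.concatenate(
                [rng.choice(groups[j], size=groups[j].size) for j in idx]))
        reps[i] = stat(*resampled)
    return reps.std(ddof=1)

# Toy data: 20 subjects per set, 5 dependent scores each (shared subject effect).
ga = [rng.normal(1.0, 1.0, 5) + rng.normal() for _ in range(20)]
gb = [rng.normal(0.0, 1.0, 5) + rng.normal() for _ in range(20)]
weighted_sum = lambda a, b: 0.5 * (a > 0.5).mean() + 0.5 * (b < 0.5).mean()

# Stability check: coefficient of variation of the SE over repeated runs.
ses = np.array([two_layer_bootstrap_se(ga, gb, weighted_sum) for _ in range(10)])
print(f"SE ~ {ses.mean():.4f}, CV ~ {ses.std(ddof=1) / ses.mean():.3f}")  # target <= 0.02
```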
52.
Central to many inferential situations is the estimation of rational functions of parameters. The mainstream approach in statistics and econometrics estimates these quantities by plug-in, without regard to the main objective of the inferential situation. We propose the Bayesian Minimum Expected Loss (MELO) approach, which focuses explicitly on the function of interest and calculates its frequentist variability. The asymptotic properties of the MELO estimator are similar to those of the plug-in approach. Nevertheless, simulation exercises show that our proposal performs better in situations characterised by small sample sizes and/or noisy data sets. In addition, we observe in the applications that our approach gives lower standard errors than frequently used alternatives when data sets are not very informative.
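For one classical instance of this idea, Zellner's MELO estimator of a ratio θ1/θ2 under the weighted squared loss (θ2·δ − θ1)² has the closed form δ* = E[θ1θ2]/E[θ2²] under the posterior. The sketch below (the normal posterior draws and all numbers are illustrative, not the paper's model) contrasts it with the plug-in ratio, whose denominator is noisy.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative posterior draws for (theta1, theta2); in practice these would
# come from the posterior of the model at hand (e.g., via MCMC).
theta1 = rng.normal(2.0, 0.8, size=100_000)
theta2 = rng.normal(1.0, 0.8, size=100_000)   # noisy denominator

plug_in = theta1.mean() / theta2.mean()                 # ratio of point estimates
melo = (theta1 * theta2).mean() / (theta2 ** 2).mean()  # argmin E[(theta2*d - theta1)^2]

print(f"plug-in: {plug_in:.3f}  MELO: {melo:.3f}")
```

With a noisy denominator the MELO estimate shrinks toward zero relative to the plug-in ratio, which is the kind of stabilization the abstract reports for uninformative data.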
53.
The performance of a retail store depends on its ability to attract customer traffic, match labor with incoming traffic, and convert the incoming traffic into sales. Retailers make significant investments in marketing activities (such as advertising) to bring customers into their stores, and in in-store labor to convert that traffic into sales. Thus, a common trade-off that retail store managers face concerns the allocation of a store's limited budget between advertising and labor to enhance store-level sales. To explore that trade-off, we develop a centralized model that allocates a limited store budget between store labor and advertising with the objective of maximizing store sales. We find that a store's inherent potential to drive traffic plays an important role, among other factors, in the relative allocation between advertising and store labor. We also find that as advertising instruments become more effective in bringing traffic to stores, managers should not always capitalize on this effectiveness by increasing their existing allocations to advertising. In addition, we discuss a decentralized setting in which budget allocation decisions cannot be enforced by a store manager, and we present a simple mechanism that achieves the centralized solution. In an extension, we address the budget allocation problem in the presence of marketing efforts to shift store traffic from peak to off-peak hours and show that our initial findings are robust. Further, we illustrate how the solution from the budget allocation model can be used to facilitate store-level sales force planning and scheduling decisions. Based on the results of our model, we present several insights that can help managers in budget allocation and sales force planning.
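A toy version of the advertising/labor trade-off (the functional forms, parameters, and scipy usage are illustrative assumptions, not the paper's model): traffic responds concavely to advertising, conversion responds concavely to labor, and sales are their product under a fixed budget.

```python
import numpy as np
from scipy.optimize import minimize_scalar

BUDGET = 100.0        # total store budget to split
BASE_TRAFFIC = 50.0   # store's inherent potential to draw traffic

def sales(adv):
    labor = BUDGET - adv
    traffic = BASE_TRAFFIC + 8.0 * np.sqrt(adv)    # concave traffic response to ads
    conversion = 0.6 * labor / (20.0 + labor)      # concave conversion response to labor
    return traffic * conversion

res = minimize_scalar(lambda a: -sales(a), bounds=(0.0, BUDGET), method="bounded")
print(f"advertising {res.x:.1f}, labor {BUDGET - res.x:.1f}, sales {sales(res.x):.1f}")
```

Because both responses are concave and sales vanish when either input gets the whole budget, the optimum is interior: neither advertising nor labor absorbs the entire budget.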
54.
Biomarkers such as DNA adducts have significant potential to improve quantitative risk assessment by characterizing individual differences in the metabolism of genotoxins and in DNA repair, and by accounting for some of the factors that could affect interindividual variation in cancer risk. Inherent uncertainty in laboratory measurements and within-person variability of DNA adduct levels over time are putatively unrelated to cancer risk and should be subtracted from the observed variation to better estimate interindividual variability of response to carcinogen exposure. A total of 41 volunteers, both smokers and nonsmokers, were asked to provide a peripheral blood sample every three weeks for several months in order to assess the intraindividual variability of polycyclic aromatic hydrocarbon (PAH)-DNA adduct levels. The intraindividual variance in PAH-DNA adduct levels, together with measurement uncertainty (laboratory variability and unaccounted-for differences in exposure), constituted roughly 30% of the overall variance. An estimated 70% of the total variance was contributed by interindividual variability and is probably representative of the true biologic variability of response to carcinogenic exposure in lymphocytes. The estimated interindividual variability in DNA damage, after subtracting intraindividual variability and measurement uncertainty, was 24-fold. Interindividual variance was higher (52-fold) in persons who constitutively lack the glutathione S-transferase M1 (GSTM1) gene, which is important in the detoxification pathway of PAH. Risk assessment models that do not consider the variability of susceptibility to DNA damage following carcinogen exposure may underestimate risks to the general population, especially for those people who are most vulnerable.
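The decomposition described here can be sketched as a one-way random-effects model on log adduct levels. In the sketch below the data are synthetic and the two variance components are chosen only to echo the reported orders of magnitude (≈30% within-person share, ≈24-fold between-person range); nothing here reproduces the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)

n_people, n_visits = 41, 6
sigma_between, sigma_within = 0.8, 0.5      # log-scale SDs (illustrative)

person_mean = rng.normal(0.0, sigma_between, size=n_people)
logs = person_mean[:, None] + rng.normal(0.0, sigma_within, (n_people, n_visits))

within_var = logs.var(axis=1, ddof=1).mean()   # intraindividual + measurement
between_var = logs.mean(axis=1).var(ddof=1) - within_var / n_visits
within_share = within_var / (within_var + between_var)

# Fold range of the between-person distribution (lognormal assumption):
# ratio of the 97.5th to the 2.5th percentile.
fold = np.exp(2 * 1.96 * np.sqrt(max(between_var, 0.0)))
print(f"within-person share ~ {within_share:.0%}, fold range ~ {fold:.0f}")
```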
55.
The non-systematic nature of language itself and the aesthetic character of literary creation are important causes of variation in literary language. In the creative process, writers must grasp and express things that hold multiple possibilities and unlimited development, and must convey their own unique perceptions; literary language is therefore compelled to "turn the conventional into the variant," while the non-systematic nature of language provides the material possibility for doing so. The variation of literary language manifests itself as phonetic variation, semantic variation, and grammatical variation.
56.
Dose assessment is important both for protecting people from radiation exposure and for managing post-accident situations adequately. However, the radiation doses received by people cannot be determined with complete accuracy, because of the uncertainty and variability inherent in defining individual characteristics and in the dose assessment process itself. In this study, a dose assessment model was developed based on measurements and surveys of individual doses and the relevant contributors (ambient dose rates and behavior patterns) in Fukushima City for four population groups: Fukushima City Office staff, the Senior Citizens' Club, the Contractors' Association, and the Agricultural Cooperative. In addition, probabilistic assessments were performed for these population groups by considering the spatial variability of contamination and the interpopulation differences arising from behavior patterns. When compared with the actual measurements, the assessment results for participants from the Fukushima City Office agreed with the measured values, validating the model and the approach. Although the assessment results obtained for the Senior Citizens' Club and the Agricultural Cooperative differ in part from the measured values, these results can be improved by further accounting for the dose-reduction effects of decontamination and for additional exposure sources in agricultural fields. By contrast, the measurements obtained for participants from the Contractors' Association were not reproduced well in the present study; assessing the doses to this group will require further investigation of association members' work activities and the related dose-reduction effects.
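A hedged sketch of such a probabilistic assessment (the distributions, the indoor shielding factor, and the time budgets are illustrative assumptions, not the study's inputs): each simulated individual's dose is accumulated as dose rate × occupancy time over indoor and outdoor microenvironments, yielding a distribution rather than a point estimate.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000   # simulated members of one population group

# Illustrative inputs: outdoor ambient dose rate (uSv/h) with lognormal
# spatial variability, an indoor reduction factor, and a behavior pattern.
rate_out = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n)
rate_in = 0.4 * rate_out                          # indoor shielding (assumed)
hours_out = rng.triangular(1.0, 3.0, 6.0, size=n)  # daily hours outdoors
hours_in = 24.0 - hours_out

annual_mSv = (rate_out * hours_out + rate_in * hours_in) * 365.0 / 1000.0
print(np.percentile(annual_mSv, [5, 50, 95]))     # distribution, not a point value
```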
57.
Environmental tobacco smoke (ETS) is a major contributor to indoor human exposure to fine particulate matter of 2.5 μm or smaller (PM2.5). The Stochastic Human Exposure and Dose Simulation for Particulate Matter (SHEDS-PM) model developed by the U.S. Environmental Protection Agency estimates distributions of outdoor and indoor PM2.5 exposure for a specified population based on ambient concentrations and indoor emission sources. A critical assessment was conducted of the methodology and data used in SHEDS-PM for estimating indoor exposure to ETS. For the residential microenvironment, SHEDS-PM uses a mass-balance approach, which is consistent with best practice. The default inputs of SHEDS-PM were reviewed, and more recent and extensive data sources were identified. Sensitivity analysis was used to determine which inputs should be prioritized for updating; the proportions of smokers and "other smokers" and the cigarette emission rate were found to be important. SHEDS-PM does not currently account for in-vehicle ETS exposure, yet in-vehicle ETS-related PM2.5 levels can exceed those in residential microenvironments by a factor of 10 or more; therefore, a mass-balance-based methodology for estimating in-vehicle ETS PM2.5 concentration is evaluated. Recommendations are made regarding updates to the input data and algorithms related to ETS exposure in the SHEDS-PM model. Interindividual variability in ETS exposure was quantified, and geographic variability was quantified based on the varying prevalence of smokers in five selected locations in the United States.
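The mass-balance approach mentioned here can be sketched as a steady-state well-mixed box model, C = E / (V·(a + k)), with emission rate E, volume V, air-exchange rate a, and deposition rate k. The parameter values below are illustrative assumptions for the sketch, not SHEDS-PM defaults; they merely show why a small, modestly ventilated cabin can far exceed residential concentrations.

```python
def steady_state_pm25(emission_mg_h, volume_m3, ach_h, deposition_h):
    """Steady-state PM2.5 concentration (ug/m3) in a well-mixed volume."""
    return emission_mg_h * 1000.0 / (volume_m3 * (ach_h + deposition_h))

# Illustrative scenarios: ~10 mg/h ETS emission while smoking is ongoing,
# a closed car cabin vs. a whole residence (all values assumed).
c_vehicle = steady_state_pm25(emission_mg_h=10.0, volume_m3=3.0,
                              ach_h=2.0, deposition_h=0.2)
c_home = steady_state_pm25(emission_mg_h=10.0, volume_m3=300.0,
                           ach_h=0.5, deposition_h=0.2)
print(f"vehicle ~ {c_vehicle:.0f} ug/m3, residence ~ {c_home:.0f} ug/m3")
```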
58.
Robust Control in Water Management (total citations: 2; self-citations: 0; citations by others: 2)
Since surface water flows are often stochastic, water reservoirs have a role in protecting users against uncertainty. We assume uncertainty regarding the probability distribution of the stochastic variable: the decision maker allows for a range of approximate models that could be true, and the problem can be solved using robust optimal control. This paper analyses the implications of a robust framework for resource management decisions, using water as an illustration. Robust choices are compared with those of a benchmark stochastic model, and the emergence of precautionary behaviour is discussed.
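A stylized illustration of the robust (max-min) choice (the inflow models, the payoff function, and the shortage penalty are invented for this sketch): the robust release maximizes the worst-case expected payoff across the candidate inflow distributions, rather than the payoff under a single trusted model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Candidate inflow models: the true distribution is unknown, so the
# decision maker entertains a set of approximate models.
models = {
    "baseline": lambda size: rng.lognormal(np.log(100.0), 0.3, size),
    "dry":      lambda size: rng.lognormal(np.log(80.0), 0.5, size),
}

def expected_payoff(release, inflow):
    delivered = np.minimum(release, inflow)
    shortage = release - delivered             # promised but unavailable water
    return np.mean(np.sqrt(delivered) - 0.05 * shortage ** 2)

releases = np.linspace(10.0, 150.0, 141)
payoff = {name: np.array([expected_payoff(r, draw(20_000)) for r in releases])
          for name, draw in models.items()}

naive = releases[payoff["baseline"].argmax()]              # trusts one model
robust = releases[np.minimum(*payoff.values()).argmax()]   # max-min over models
print(f"naive release: {naive:.0f}, robust release: {robust:.0f}")
```

In this toy setting the max-min rule typically commits to a lower release, mirroring the precautionary behaviour discussed above.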
59.
Traditionally, microbial risk assessors have used point estimates to evaluate the probability that an individual will become infected. We developed a quantitative approach that shifts the risk characterization perspective from point estimates to distributional estimates, and from the individual to the population. To this end, we first designed and implemented a dynamic model that tracks traditional epidemiological variables, such as the numbers of susceptible, infected, diseased, and immune individuals, as well as environmental variables such as pathogen density. Second, we used a simulation methodology that explicitly acknowledges the uncertainty and variability associated with the data. Specifically, the approach consists of assigning probability distributions to each parameter, sampling from these distributions in Monte Carlo simulations, and using a binary classification to assess the output of each simulation. A case study is presented that explores the uncertainties in assessing the risk of giardiasis when swimming in a recreational impoundment using reclaimed water. Using literature-based information to assign parameter ranges, our analysis demonstrated that the parameter describing the shedding of pathogens by infected swimmers contributed most to the uncertainty in risk. The importance of the other parameters depended on narrowing the a priori range of this shedding parameter: when the shedding parameter was constrained to its lower subrange, treatment efficiency was the parameter most important in predicting whether a simulation resulted in prevalences above or below non-outbreak levels, whereas parameters associated with human exposure were important when the shedding parameter was constrained to a higher subrange. This Monte Carlo simulation technique identified conditions under which outbreaks and/or non-outbreaks are likely, and identified the parameters that contributed most to the uncertainty associated with a risk prediction.
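A compact sketch of this simulation methodology (the dynamic model, parameter ranges, and outbreak threshold are illustrative, not the case study's values): parameters are drawn from distributions, a susceptible/infected loop coupled to a pathogen pool is run, and each run is classified as outbreak or non-outbreak.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate(shedding, treatment_eff, exposure, days=100, n0=1000):
    """Toy susceptible/infected dynamics coupled to a pathogen pool."""
    S, I, pathogens, peak = n0 - 5, 5, 0.0, 5
    for _ in range(days):
        pathogens = 0.7 * pathogens + shedding * I * (1.0 - treatment_eff)
        p_inf = 1.0 - np.exp(-exposure * pathogens / n0)
        new = rng.binomial(S, p_inf)
        S, I = S - new, I + new - int(0.2 * I)   # 20% recover each day
        peak = max(peak, I)
    return peak

n_sim, outbreaks = 2000, 0
for _ in range(n_sim):
    peak = simulate(shedding=rng.uniform(0.01, 1.0),      # dominant uncertainty
                    treatment_eff=rng.uniform(0.90, 0.999),
                    exposure=rng.uniform(0.1, 1.0))
    outbreaks += peak > 50                                # binary classification
print(f"fraction of runs classified as outbreaks: {outbreaks / n_sim:.2f}")
```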
60.
Recently, lag-phase research in predictive microbiology has focused increasingly on individual-cell variability, especially for pathogenic microorganisms that typically occur at very low contamination levels, such as Listeria monocytogenes. In this study, the effect of this individual-cell lag-phase variability was introduced into an exposure assessment study for L. monocytogenes in a liver pâté. A basic framework was designed to estimate the contamination level of pâté at the time of consumption, taking into account the frequency of contamination and the initial contamination levels of pâté at retail. Growth was calculated for pâté units of 150 g, comparing an individual-based approach with a classical population-based approach, and the two protocols were compared using simulations. If only the individual-cell lag variability was taken into account, important differences in cell density at the time of consumption were observed between the individual-based approach and the classical approach, especially at low inoculum levels, resulting in high variability under the individual-based approach. However, when all variable factors were taken into account, no significant differences were observed between the approaches, leading to the conclusion that the individual-cell lag-phase variability was overruled by the global variability of the exposure assessment framework. Even under more extreme conditions, such as a low inoculum level or a low water activity, no differences in cell density at the time of consumption arose between the individual-based approach and the classical approach. This means that the individual-cell lag-phase variability of L. monocytogenes has important consequences when studying specific growth cases, especially when the applied inoculum levels are low, but in more general exposure assessment studies the variability between individual-cell lag phases is too limited to have a major impact on the total exposure assessment.
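A small sketch of the individual-based versus population-based comparison (the exponential lag distribution, growth rate, and storage time are illustrative assumptions, not the study's model): each cell gets its own lag before exponential growth, and the spread of final log counts shrinks as the inoculum grows, echoing the low-inoculum effect described above.

```python
import numpy as np

rng = np.random.default_rng(6)

MU = 0.3         # specific growth rate (1/h), illustrative
T = 48.0         # time from retail to consumption (h), illustrative
MEAN_LAG = 10.0  # mean individual lag time (h), illustrative

def log10_final_count(n0, individual=True):
    """log10 cell count in one 150 g unit after lag + exponential growth."""
    lags = (rng.exponential(MEAN_LAG, n0) if individual
            else np.full(n0, MEAN_LAG))       # population approach: one shared lag
    return np.log10(np.exp(MU * np.clip(T - lags, 0.0, None)).sum())

for n0 in (1, 10, 1000):
    ind = [log10_final_count(n0, True) for _ in range(500)]
    print(f"n0={n0:4d}  individual: {np.mean(ind):.2f} +/- {np.std(ind):.2f}  "
          f"population: {log10_final_count(n0, False):.2f}")
```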