171.
The emerging field of cancer radiomics endeavors to characterize intrinsic patterns of tumor phenotypes and surrogate markers of response by transforming medical images into objects that yield quantifiable summary statistics, to which regression and machine learning algorithms may be applied for statistical interrogation. Recent literature has identified clinicopathological associations based on textural features derived from gray-level co-occurrence matrices (GLCMs), which facilitate evaluation of gray-level spatial dependence within a delineated region of interest. GLCM-derived features, however, tend to contribute highly redundant information. Moreover, when reporting selected feature sets, investigators often fail to adjust for multiplicities and commonly fail to convey the predictive power of their findings. This article presents a Bayesian probabilistic modeling framework that treats the GLCM as a multivariate object and describes its application to cancer detection with computed tomography. The methodology, which circumvents processing steps and avoids evaluation of reductive and highly correlated feature sets, uses a latent Gaussian Markov random field structure to characterize spatial dependencies among GLCM cells and facilitates classification via predictive probability. Correctly predicting the underlying pathology of 81% of the adrenal lesions in our case study, the proposed method outperformed current practices, which achieved a maximum accuracy of only 59%. Simulations and theory are presented to further elucidate this comparison and to ascertain the utility of applying multivariate Gaussian spatial processes to GLCM objects.
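Since the method models the GLCM itself as a multivariate object, it helps to recall how such a matrix is built. The sketch below is a minimal illustration, not the authors' pipeline; the 4-level image and the single horizontal offset are assumptions chosen for demonstration:

```python
import numpy as np

def glcm(image, levels, offset=(0, 1)):
    """Count co-occurrences of gray levels at a fixed pixel offset and
    normalize the counts to a joint probability table."""
    dr, dc = offset
    counts = np.zeros((levels, levels), dtype=float)
    rows, cols = image.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                counts[image[r, c], image[r2, c2]] += 1
    return counts / counts.sum()

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
P = glcm(img, levels=4)   # 4x4 table of co-occurrence probabilities
```

Each cell (i, j) holds the probability that gray level i is followed by gray level j at the given offset; classical texture features (contrast, homogeneity, and so on) reduce this table to scalars, which is exactly the reductive step the proposed framework avoids.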
173.
Ethnicity-related dating preferences among Asian American adolescents, and the links between those preferences (i.e., for a same-ethnic dating partner) and ethnic identity centrality and regard, American identity centrality and regard, parent–adolescent closeness, and perceptions of discrimination, were investigated. Data from 175 self-identified Asian American high school students were collected yearly for four consecutive years. Higher levels of ethnic identity centrality and regard and of parent–adolescent closeness, averaged across the four years, were associated with preferring a same-ethnic partner. Moreover, foreign-born adolescents were less likely to prefer a same-ethnic partner in years when they encountered a higher-than-average level of discrimination. The results highlight variability in the developmental and individual-level factors that shape how adolescents navigate their dating relationships.
174.
Our everyday practices are increasingly mediated through online technologies, entailing the navigation, and often the simultaneous creation, of large quantities of information and communication data. The scale and types of activities being undertaken, the data being created and engaged with, and the possibilities for analysis, archiving and distribution are now so extensive that technical constructs are required to manage, interpret and distribute them. These constructs include platforms, software, code and algorithms. This paper explores the place of the algorithm in shaping and engaging with the contemporary everyday. It does so by examining particular instances of algorithmic sorting and presentation, and by considering some of the ways these contribute to shaping our everyday practices and understandings. In doing so, it raises questions about understandings of agency and power, shifting world views, and our complex relationship with technologies.
175.
The distribution of the test statistics of homogeneity tests is often unknown, requiring the estimation of the critical values through Monte Carlo (MC) simulations. Computing the critical values at low α, especially when the distribution of the statistic changes with the series length (sample cardinality), requires a considerable number of simulations to achieve reasonable precision (i.e. 10^6 simulations or more for each series length). If, in addition, the test itself requires notable computational effort, estimating the critical values may need unacceptably long runtimes.

To overcome this problem, the paper proposes a regression-based refinement of an initial MC estimate of the critical values, which also allows the achieved improvement to be approximated. The paper then applies the method to two tests: SNHT (standard normal homogeneity test, widely used in climatology) and SNH2T (a version of SNHT with quadratic computational complexity). For both, it reports the critical values for α ranging from 0.1 to 0.0001 (useful for p-value estimation) and series lengths ranging from 10 (a widely adopted size in the climatological change-point detection literature) to 70,000 elements (nearly the length of a 200-year daily time series), estimated with coefficients of variation within 0.22%. For SNHT, a comparison of our results with approximate, theoretically derived critical values is also performed; we suggest adopting those values for series exceeding 70,000 elements.
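For concreteness, the baseline that the regression refinement improves on, plain MC estimation of an SNHT critical value, can be sketched as follows. The statistic follows Alexandersson's standard definition; `nsim` here is orders of magnitude below the 10^6 runs the paper discusses, so the resulting α = 0.05 estimate is only rough:

```python
import numpy as np

def snht_stat(x):
    """Alexandersson's SNHT statistic: maximum over candidate change
    points k of k*z1^2 + (n-k)*z2^2 on the standardized series."""
    n = len(x)
    z = (x - x.mean()) / x.std()
    c = np.cumsum(z)
    k = np.arange(1, n)                  # split points 1 .. n-1
    z1 = c[:-1] / k                      # mean of the first k values
    z2 = (c[-1] - c[:-1]) / (n - k)      # mean of the remaining n-k values
    return np.max(k * z1**2 + (n - k) * z2**2)

def mc_critical_value(n, alpha=0.05, nsim=2000, rng=None):
    """Crude MC estimate of the null critical value for series length n:
    simulate Gaussian white noise and take the (1 - alpha) quantile."""
    if rng is None:
        rng = np.random.default_rng(0)
    stats = np.array([snht_stat(rng.standard_normal(n)) for _ in range(nsim)])
    return np.quantile(stats, 1 - alpha)
```

The paper's contribution is then to regress such raw quantiles across series lengths, which both smooths the estimates and quantifies the gain in precision.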

176.
We study one-dimensional Ornstein–Uhlenbeck (OU) processes with marginal law given by tempered stable and tempered infinitely divisible distributions. We investigate the transition law between consecutive observations of these processes and evaluate the characteristic function of integrated tempered OU processes with a view toward practical applications. We then analyze how to draw a random sample from this class of processes by considering both the classical inverse transform algorithm and an acceptance–rejection method based on simulating a stable random sample. Using a maximum likelihood estimation method based on the fast Fourier transform, we empirically assess the simulation algorithm's performance.
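The acceptance–rejection idea mentioned above can be illustrated in the special case α = 1/2, where the positive stable proposal has a closed form: if N is standard normal, 1/(2N^2) is positive 1/2-stable with Laplace transform exp(-√s), and accepting a proposal S with probability exp(-lam·S) exponentially tempers it. This is a sketch of the idea only; the choice α = 1/2 and the tempering parameter `lam` are illustrative, not the paper's general-α implementation:

```python
import numpy as np

def tempered_stable_half(lam, size, rng):
    """Sample exponentially tempered positive 1/2-stable variates.

    Propose S = 1/(2 N^2) (positive 1/2-stable), accept with probability
    exp(-lam * S); the overall acceptance rate is exp(-sqrt(lam)).
    """
    out = []
    while len(out) < size:
        n = rng.standard_normal(size)
        s = 0.5 / n**2                       # positive 1/2-stable proposals
        u = rng.random(size)
        out.extend(s[u <= np.exp(-lam * s)]) # keep the tempered survivors
    return np.array(out[:size])

# For lam = 1 the tempered law has mean alpha * lam**(alpha - 1) = 0.5.
x = tempered_stable_half(1.0, 20000, np.random.default_rng(0))
```

The inverse transform alternative the abstract compares against would instead numerically invert the transition distribution function, which is why its relative cost matters in practice.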
177.
Many commonly used statistical methods for data analysis or clinical trial design rely on incorrect assumptions or assume an over-simplified framework that ignores important information. Such statistical practices may lead to incorrect conclusions about treatment effects, or to clinical trial designs that are impractical or do not accurately reflect the investigator's goals. Bayesian nonparametric (BNP) models and methods are a very flexible class of statistical tools that can overcome such limitations, because BNP models can accurately approximate any distribution or function and can accommodate a broad range of statistical problems, including density estimation, regression, survival analysis, graphical modeling, neural networks, classification, clustering, population models, forecasting and prediction, spatiotemporal models, and causal inference. This paper describes three illustrative applications of BNP methods: a randomized clinical trial comparing treatments for intraoperative air leaks after pulmonary resection, estimating survival time under different multi-stage chemotherapy regimes for acute leukemia, and evaluating the joint effects of targeted treatment and an intermediate biological outcome on progression-free survival time in prostate cancer.
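The flexibility claim rests on priors such as the Dirichlet process, the canonical BNP building block. A truncated stick-breaking draw can be sketched as follows (the truncation level `K`, concentration `alpha`, and standard normal base measure are illustrative choices, not values from the paper):

```python
import numpy as np

def stick_breaking_dp(alpha, K, rng):
    """K-atom truncation of G ~ DP(alpha, N(0, 1)) via stick-breaking:
    beta_k ~ Beta(1, alpha), w_k = beta_k * prod_{j<k} (1 - beta_j)."""
    betas = rng.beta(1.0, alpha, K)
    stick_left = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    weights = betas * stick_left         # weights sum to just under 1
    atoms = rng.standard_normal(K)       # atom locations from the base measure
    return atoms, weights

atoms, weights = stick_breaking_dp(alpha=2.0, K=50, rng=np.random.default_rng(0))
```

A random distribution drawn this way places probability `weights[k]` on `atoms[k]`; mixing a kernel over such draws is what yields the density estimates and flexible regressions listed above.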
179.
A qualitative inquiry into the perceptions and experiences of child welfare protective services workers in the United States was conducted to examine how they understand crisis and crisis intervention, and the emotional impact of working with children who have endured significant maltreatment. Because little research has explored these issues from the workers' own point of view, a grounded theory approach was used. Four themes emerged from the data: workers perceive crisis as the result of biopsychosocial breakdown; workers routinely triage when faced with crises; workers are subject to vicarious traumatization; and workers' personal lives are affected by their work. The findings add to the existing body of knowledge about secondary trauma in child welfare by documenting investigative workers' subjective experience of it. This research contributes to understanding workers' subjective experience of crisis on the job, how it manifests, and whether they feel knowledgeable in the area of crisis intervention.