11.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Owing to shifts in the track and the rapid intensification of the storm, real-time loss estimates grew from 2-3 billion dollars late on August 12 to a brief peak of 50 billion dollars as the storm appeared to be headed for the Tampa Bay area. The storm instead hit the resort areas of Charlotte Harbor near Punta Gorda and then moved on to Orlando in the central part of the state, with early poststorm estimates converging on the 28-31 billion dollar range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To inform future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. Sensitivity analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss that is attributable to each of the input variables.
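The Holland B parameter referred to above controls the shape of the storm's radial pressure profile, and hence the wind field that drives loss estimates. As a hedged illustration (the standard Holland (1980) gradient wind profile, not the audited proprietary model), it can be sketched as follows; all parameter values in the example are illustrative assumptions:

```python
import math

def holland_gradient_wind(r_km, b, delta_p_pa, rmax_km,
                          rho=1.15, f=5e-5):
    """Gradient-level wind speed (m/s) at radius r_km from the storm
    center, using the Holland (1980) pressure-profile parameter B.

    delta_p_pa: central pressure deficit (Pn - Pc) in pascals
    rho: air density (kg/m^3); f: Coriolis parameter (1/s)
    """
    r = r_km * 1000.0
    x = (rmax_km / r_km) ** b
    term = (b * delta_p_pa / rho) * x * math.exp(-x)
    return math.sqrt(term + (r * f / 2.0) ** 2) - r * f / 2.0
```

The profile peaks at the radius of maximum winds and decays outward; uncertainty in B propagates directly into the peak wind speed, which is why it features prominently in the sensitivity analysis.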
12.
We discuss the issue of using benchmark doses for quantifying (excess) risk associated with exposure to environmental hazards. The paradigm of low-dose risk estimation in dose-response modeling is used as the primary application scenario. Emphasis is placed on making simultaneous inferences on benchmark doses when data are in the form of proportions, although the concepts translate easily to other forms of outcome data.
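For concreteness, here is a minimal sketch of a pointwise (not simultaneous) benchmark-dose calculation under an assumed logistic dose-response model; the parameter values and benchmark response (BMR) level are illustrative, not taken from the paper:

```python
import math

def extra_risk(dose, b0=-3.0, b1=0.5):
    """Extra risk over background for a logistic dose-response model
    P(d) = 1/(1 + exp(-(b0 + b1*d))); parameters are illustrative."""
    p = lambda d: 1.0 / (1.0 + math.exp(-(b0 + b1 * d)))
    return (p(dose) - p(0.0)) / (1.0 - p(0.0))

def benchmark_dose(bmr=0.10, lo=0.0, hi=100.0, tol=1e-8):
    """Find the dose at which extra risk equals the benchmark
    response (BMR) by bisection; extra risk is increasing in dose."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if extra_risk(mid) < bmr:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

The simultaneous-inference problem the paper addresses replaces this single inversion with confidence limits that hold jointly across a range of BMR levels.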
13.
William S. Pease, Risk Analysis, 1992, 12(2): 253-265
The extent of carcinogen regulation under existing U.S. environmental statutes is assessed by developing measures of the scope and stringency of regulation. While concern about cancer risk has played an important political role in obtaining support for pollution control programs, it has not provided the predominant rationale for most regulatory actions taken to date. Less than 20% of all standards established to limit concentrations of chemicals in various media address carcinogens. Restrictions on chemical use are more frequently based on concerns about noncancer human health or ecological effects. Of the chemicals in commercial use which have been identified as potential human carcinogens on the basis of rodent bioassays, only a small proportion are regulated. There is an inverse relationship between the scope of regulatory coverage and the stringency of regulatory requirements: the largest percentages of identified carcinogens are affected by the least stringent requirements, such as information disclosure. Standards based on de minimis cancer risk levels have been established for only 10% of identified carcinogens and are restricted to one medium: water. Complete bans on use have affected very few chemicals. The general role that carcinogenicity now plays in the regulatory process is not dramatically different from that of other adverse human health effects: if a substance is identified as a hazard, it may eventually be subject to economically achievable and technically feasible restrictions.
14.
Peter J. Robinson, Risk Analysis, 1992, 12(1): 139-148
Because of the inherent complexity of biological systems, there is often a choice between a number of apparently equally applicable physiologically based models to describe uptake and metabolism processes in toxicology or risk assessment. These models may fit the particular data sets of interest equally well, but may give quite different parameter estimates or predictions under different (extrapolated) conditions. Such competing models can be discriminated by a number of methods, including potential refutation by means of strategic experiments, and their ability to suitably incorporate all relevant physiological processes. For illustration, three currently used models for steady-state hepatic elimination--the venous equilibration model, the parallel tube model, and the distributed sinusoidal perfusion model--are reviewed and compared with particular reference to their application in the area of risk assessment. The ability of each of the models to describe and incorporate such physiological processes as protein binding, precursor-metabolite relations and hepatic zones of elimination, capillary recruitment, capillary heterogeneity, and intrahepatic shunting is discussed. Differences between the models in hepatic parameter estimation, extrapolation to different conditions, and interspecies scaling are discussed, and criteria for choosing one model over the others are presented. In this case, the distributed model provides the most general framework for describing physiological processes taking place in the liver, and has so far not been experimentally refuted, whereas the other two models have been. These simpler models may, however, provide useful bounds on parameter estimates and on extrapolations and risk assessments.
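Two of the three models have simple closed forms for steady-state hepatic clearance, which makes their divergence at high intrinsic clearance easy to see. A sketch (symbols: Q hepatic blood flow, fu unbound fraction, CLint intrinsic clearance; the distributed sinusoidal perfusion model has no comparably simple closed form and is omitted):

```python
import math

def cl_venous_equilibration(q, fu, clint):
    """Venous equilibration ("well-stirred") hepatic clearance:
    CL = Q * fu * CLint / (Q + fu * CLint)."""
    return q * fu * clint / (q + fu * clint)

def cl_parallel_tube(q, fu, clint):
    """Parallel tube hepatic clearance:
    CL = Q * (1 - exp(-fu * CLint / Q))."""
    return q * (1.0 - math.exp(-fu * clint / q))
```

Both formulas agree closely when fu*CLint is small relative to Q, but the parallel tube model predicts higher clearance for high-extraction compounds; both are bounded above by Q, so the choice matters most exactly where extrapolation is riskiest.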
15.
A survey was conducted of approximately 200 Asian Indian Americans and 200 other residents of New Jersey in order to understand the risk management priorities that they want government to have. We found that Asian Indian Americans, especially younger women, focused on personal/family risks, such as alcohol and drug abuse, sexual abuse, and domestic violence. The New Jersey comparison group, in contrast, placed war/terrorism and loss of health care services and insurance at the top of their priorities for government. These results suggest stressful acculturation-related issues within the Asian Indian community. Both populations want more risk management from government than they believe government is currently providing. Respondents who wanted more from government tended to dread the risk, be fearful of the consequences, trust government, and have a feeling of personal efficacy. Within the Asian Indian American sample, wide variations were observed by language spoken at home and religious affiliation. Notably, Muslims and Hindi language speakers tended not to trust government and hence wanted less government involvement. This study supports our call for studies of recent migrant populations and Johnson's call for testing ethnic identity and acculturation as factors in risk judgments.
16.
The authors consider the optimal design of sampling schedules for binary sequence data. They propose an approach which allows a variety of goals to be reflected in the utility function by including deterministic sampling cost, a term related to prediction, and, if relevant, a term related to learning about a treatment effect. To this end, they use a nonparametric probability model relying on a minimal number of assumptions. They show how their assumption of partial exchangeability for the binary sequence of data allows the sampling distribution to be written as a mixture of homogeneous Markov chains of order k. The implementation follows the approach of Quintana & Müller (2004), which uses a Dirichlet process prior for the mixture.
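Under partial exchangeability, the likelihood of a binary sequence depends on the data only through its order-k transition counts. A minimal sketch of those sufficient statistics (just the counting step, not the Dirichlet process mixture itself):

```python
from collections import Counter

def transition_counts(seq, k=2):
    """Counts of (length-k history, next symbol) pairs for a binary
    sequence -- the sufficient statistics of an order-k homogeneous
    Markov chain."""
    counts = Counter()
    for i in range(k, len(seq)):
        counts[(tuple(seq[i - k:i]), seq[i])] += 1
    return counts
```

Two sequences with the same counts (for the assumed order k) are exchangeable under the model, which is what lets the mixture representation collapse the design problem onto these statistics.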
17.
Andris Abakuks, Journal of the Royal Statistical Society: Series A (Statistics in Society), 2007, 170(3): 841-850
In New Testament studies, the synoptic problem is concerned with the relationships between the gospels of Matthew, Mark and Luke. In an earlier paper a careful specification in probabilistic terms was set up of Honoré's triple-link model. In the present paper, a modification of Honoré's model is proposed. As previously, counts of the numbers of verbal agreements between the gospels are examined to investigate which of the possible triple-link models appears to give the best fit to the data, but now using the modified version of the model and additional sets of data.
18.
The authors propose graphical and numerical methods for checking the adequacy of the logistic regression model for matched case-control data. Their approach is based on the cumulative sum of residuals over the covariate or linear predictor. Under the assumed model, the cumulative residual process converges weakly to a centered Gaussian limit whose distribution can be approximated via computer simulation. The observed cumulative residual pattern can then be compared both visually and analytically to a certain number of simulated realizations of the approximate limiting process under the null hypothesis. The proposed techniques allow one to check the functional form of each covariate, the logistic link function as well as the overall model adequacy. The authors assess the performance of the proposed methods through simulation studies and illustrate them using data from a cardiovascular study.
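The observed half of the procedure is simple to compute: sort the observations by the covariate (or linear predictor) and accumulate residuals. A minimal sketch, with the simulated null realizations used for comparison omitted:

```python
def cumulative_residuals(x, y, p):
    """Observed cumulative residual curve for a fitted binary model:
    W(t) = sum over {i : x_i <= t} of (y_i - p_i), where p_i is the
    fitted probability, evaluated at the sorted covariate values."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    w, curve = 0.0, []
    for i in order:
        w += y[i] - p[i]
        curve.append((x[i], w))
    return curve
```

A systematic drift in this curve (rather than a random walk around zero) suggests a misspecified functional form for that covariate; the simulated null curves calibrate how much drift is consistent with chance.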
19.
It is often of interest to find the maximum or near maxima among a set of vector-valued parameters in a statistical model; in the case of disease mapping, for example, these correspond to relative-risk "hotspots" where public-health intervention may be needed. The general problem is one of estimating nonlinear functions of the ensemble of relative risks, but biased estimates result if posterior means are simply substituted into these nonlinear functions. The authors obtain better estimates of extrema from a new, weighted ranks squared error loss function. The derivation of these Bayes estimators assumes a hidden-Markov random-field model for relative risks, and their behaviour is illustrated with real and simulated data.
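The plug-in bias is easy to demonstrate: for independent normal posteriors, the maximum of the componentwise posterior means understates the posterior mean of the maximum. A toy simulation of that gap (illustrating the bias only, not the authors' weighted-ranks estimator):

```python
import random

def maxima_bias(post_means, sd=1.0, draws=5000, seed=7):
    """Compare the max of componentwise posterior means with the
    posterior mean of the maximum, for independent normal 'posteriors'
    centered at post_means with common standard deviation sd."""
    rng = random.Random(seed)
    n = len(post_means)
    means = [0.0] * n
    mean_of_max = 0.0
    for _ in range(draws):
        sample = [rng.gauss(m, sd) for m in post_means]
        for j in range(n):
            means[j] += sample[j] / draws
        mean_of_max += max(sample) / draws
    return max(means), mean_of_max
```

With five exchangeable components centered at zero, the plug-in value is near zero while the expected maximum is near 1.16, so any loss function targeting extrema must correct for this gap.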
20.
Thomas Richardson, Scandinavian Journal of Statistics, 2003, 30(1): 145-157
We consider acyclic directed mixed graphs, in which directed edges (x → y) and bi-directed edges (x ↔ y) may occur. A simple extension of Pearl's d-separation criterion, called m-separation, is applied to these graphs. We introduce a local Markov property which is equivalent to the global property resulting from the m-separation criterion for arbitrary distributions.
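As a hedged illustration of the global criterion (a brute-force simple-path enumeration suitable only for small graphs, not Richardson's local property): a path m-connects x and y given z if every non-collider on it is outside z and every collider is an ancestor of z.

```python
def ancestors(directed, nodes):
    """All ancestors of `nodes` (inclusive) under directed edges a -> b."""
    anc, frontier = set(nodes), set(nodes)
    while frontier:
        new = {a for (a, b) in directed if b in frontier} - anc
        anc |= new
        frontier = new
    return anc

def m_separated(directed, bidirected, x, y, z):
    """Test m-separation of x and y given z in an acyclic directed
    mixed graph. directed: (a, b) pairs meaning a -> b; bidirected:
    (a, b) pairs meaning a <-> b. Edges carry arrowhead marks at each
    endpoint; a vertex is a collider on a path iff both incident path
    edges point into it."""
    z = set(z)
    # (u, v, arrowhead_at_u, arrowhead_at_v)
    edges = [(a, b, False, True) for (a, b) in directed]
    edges += [(a, b, True, True) for (a, b) in bidirected]
    anc_z = ancestors(directed, z)

    def neighbours(node):
        for (u, v, mu, mv) in edges:
            if u == node:
                yield v, mu, mv
            elif v == node:
                yield u, mv, mu

    def connects(node, arrived_into, visited):
        # Try to extend an m-connecting path from `node` to y.
        for nbr, mark_node, mark_nbr in neighbours(node):
            if node != x:
                collider = arrived_into and mark_node
                if collider and node not in anc_z:
                    continue          # blocked collider
                if not collider and node in z:
                    continue          # conditioned non-collider
            if nbr == y:
                return True
            if nbr not in visited and connects(nbr, mark_nbr, visited | {nbr}):
                return True
        return False

    return not connects(x, False, {x})
```

On the collider triple x → w ← y, x and y are m-separated marginally but become m-connected once w is conditioned on, matching the familiar d-separation behaviour; the bi-directed edges simply add arrowheads at both endpoints.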