
probability

The literature on probability spans 1958 to 2022 and comprises 276 publications, concentrated mainly in mathematics, oncology, radio electronics, and telecommunications. All 276 items are journal papers, published across 99 journals, including Science China, Chinese Chemical Letters (English edition), and the American Journal of Operations Research (English). The literature on probability has been contributed by 631 authors, including Fokrul Alom Mazarbhuiya, G. Nanjundan, and Guang Wu, among others.

probability - Publication volume

Journal papers: 276 (share: 100.00%)

Total: 276 papers

probability - Publication trend chart

probability - Researchers

  • Fokrul Alom Mazarbhuiya
  • G. Nanjundan
  • Guang Wu
  • Hong Zhou
  • Hongyun Wang
  • Jing Li
  • Pham Van Khanh
  • Editorial Office (本刊编辑部)
  • Antony J. Bourdillon
  • Arend Niehaus
Journal papers


    • Ebru HARMANDAR
    • Abstract: Fethiye is an important region located in the eastern part of the Mediterranean Basin. This region, the most active part of the south-western Anatolia extensional tectonic regime, has been affected by earthquakes, submarine landslides and tsunamis throughout history due to the existence of a complex active plate boundary zone. The active area has been exposed to ground motion with the potential to damage vulnerable structures; therefore, a detailed assessment of seismic hazard is necessary for the prevention of potential damage. In this context, probabilistic seismic hazard analysis is performed with R-CRISIS-18.3 using the refined parameters computed from the Seismic Hazard Harmonization in Europe (SHARE) project. The spatial distribution of spectral acceleration at T = 0.2 s and T = 1.0 s is calculated for the earthquake levels corresponding to average return periods of 72, 475 and 2475 years. Hazard curves for the central region of the Fethiye district are generated, and the results are discussed and compared with the values obtained from the Revision of Turkish Seismic Hazard Map Project (UDAP-C-13-06). These local results of probabilistic seismic hazard analysis will provide the basis for the preparation of seismic risk maps as future work. [A short return-period sketch follows this paper list.]
    • Thomas Beatty; Nicole Legge
    • Abstract: The finite field F_q has q elements, where q = p^k for a prime p and k ∈ N. Then F_q[x] is a unique factorization domain and its polynomials can be bijectively associated with their unique (up to order) factorizations into irreducibles. Such a factorization for a polynomial of degree n can be viewed as conforming to a specific template if we agree that factors of higher degree are written before those of lower degree, and factors of equal degree may be written in any order. For example, a polynomial f(x) of degree n may factor into irreducibles and be written as (a)(b)(c), where deg a ≥ deg b ≥ deg c. Clearly, the various partitions of n correspond to the templates available for these canonical factorizations, and we identify the templates with the possible partitions. So if f(x) is itself irreducible over F_q, it belongs to the template [n], and if f(x) splits into linear factors over F_q, it belongs to the template [1, 1, ..., 1]. Our goal is to calculate the cardinalities of the sets of polynomials corresponding to the available templates for general q and n. With this information, we characterize the associated probabilities that a randomly selected member of F_q[x] belongs to a given template. Software to facilitate the investigation of various cases is available upon request from the authors. [A sketch of the template-[n] count follows this paper list.]
    • Zhiyong Zheng; Kun Tian
    • Abstract: The main purpose of this paper is to extend results on the probability of decryption error in the learning with errors (LWE) based cryptosystem to more general disturbances. In the first section, we introduce the LWE cryptosystem together with its applications and some previous research results. We then give a more precise estimate of the probability of decryption error, first for independent identically distributed Gaussian disturbances and then for arbitrary independent identically distributed disturbances. This upper bound on the probability can be made close to 0 by choosing suitable parameters, which means that the probability of decryption error for the cryptosystem can be made sufficiently small. We thereby verify our core result that the LWE-based cryptosystem can achieve high security. [A toy decryption-error sketch follows this paper list.]
    • Ai-Gen Xie; Yi-Fan Liu; Hong-Jie Dong
    • Abstract: This study investigates two secondary electron emission (SEE) models for the photoelectric energy distribution curves f(E_ph, hγ), B, E_mean, the absolute quantum efficiency (AQE), and the mean escape depth λ of photo-emitted electrons in metals. The proposed models are developed from the density of states and the theories of photo-emission in the vacuum ultraviolet and SEE, where B is the mean probability that an internal photo-emitted electron escapes into vacuum upon reaching the emission surface of the metal, and E_mean is the mean energy of photo-emitted electrons measured from vacuum. The formulas obtained for f(E_ph, hγ), B, λ, E_mean and AQE were shown to be correct for the cases of Au at hγ = 8.1–11.6 eV, Ni at hγ = 9.2–11.6 eV, and Cu at hγ = 7.7–11.6 eV. The photoelectric cross sections (PCS) calculated here are analyzed, and it is confirmed that the calculated PCS of the electrons in the conduction band of Au at hγ = 8.1–11.6 eV, Ni at hγ = 9.2–11.6 eV, and Cu at hγ = 7.7–11.6 eV are correct.
    • John B. Andelin
    • Abstract: The theory of evolution was advanced by Darwin in 1859, prior to Mendel’s experiments demonstrating the particulate nature of inheritance. The modern synthesis was formulated in the early 1940s, well before the concept of coded information was understood. This paper outlines four mathematical challenges to the modern synthesis, based on the current understanding of the proposed mechanisms of evolutionary change within the constraints of experimental molecular biology.
    • Motohisa Osaka
    • Abstract: Suppose that money or coins are placed in a box containing an arbitrary number of prizes of several different types, and that a single prize is dispensed each time. Is the probability that all types of prizes have appeared together for the first time maximized only once the majority of prizes have been removed from the box? Simulations based on probability theory show that, contrary to expectations, this probability reaches its maximum value after a surprisingly small number of trials. This helps us understand not only mathematical phenomena but also real-world phenomena: events that cannot occur without several substances or conditions being present seem unlikely to happen, but the results of this study suggest that, contrary to expectations, they are surprisingly likely to occur probabilistically. [A coupon-collector style simulation sketch follows this paper list.]
    • Romney B.Duffey
    • Abstract: We need to predict the probability of unprecedented flooding of lands and coastlines due to unexpected storms, overflowing rivers, hurricanes, tidal surges and dam failures. This paper addresses new record floods that exceed all prior “historic” levels and are invariably due to extreme or severe weather and/or unexpected precipitation, defeating barriers and causing extensive power system outages. Given their inherently low occurrence, the probabilities of new (rare) record floods are treated as random outcomes and independent events using classical statistical mechanics and related hypergeometric sampling. This analysis straightforwardly replaces tuning or fitting to “normal” precipitation, regular tides and prior flood data, and the traditional use of multi-parameter extreme value distributions (EVDs) for weather-induced flood forecasting and estimating “return periods”. The approach is not reliant on geographic computer models, meteorological forecasting, published “flood zone” charts, or hydrological techniques and images. We illustrate the universal applicability of this Bayesian-style approach of solely addressing new records for a wide range of specific flooding case studies covering rivers, major hurricanes, quasi-periodic coastal tides, and dam failures. The quantitative link between extreme event extent and power outage duration is shown, and the results impact disaster resilience, infrastructure vulnerability and emergency preparedness measures. [A record-probability sketch follows this paper list.]
    • Dorian Conger; Ivan Vrbanic; Ivica Basic; Kenneth J.Elsea
    • Abstract: The paper discusses the framework for a risk-informed root cause analysis process. Such a process enables scaling of the analysis according to the risk associated with the undesired event or condition, thereby creating tiers of analysis in which the greater the risk, the more sophisticated the analysis. In a risk-informed root cause analysis process, a situation is normally not analyzed at a level less than what actually occurred. However, a situation may be investigated as though the consequence were greater than what actually happened, especially if only slight differences in circumstances could have resulted in a significantly higher consequence. While operational events or safety issues are normally expected to result in only negligible or marginal actual consequences, many of them have a certain potential to develop or propagate into catastrophic events. This potential can be expressed qualitatively or quantitatively. Risk-informing the root cause analysis relies on mapping the event or safety issue onto a risk matrix, which is traditionally a two-dimensional probability-consequence matrix. A new concept employed in the risk matrix for root cause analysis is that, while the probability reflects the observed or expected range of values (retaining, thus, its “traditional” meaning), the consequence reflects not only the observed or materialized impact (such as failure of equipment) but also its potential to propagate or develop into a highly undesirable final state. The paper presents the main elements of a risk-informed root cause analysis process and discusses qualitative and quantitative aspects of, and approaches to, the determination of the risk significance of operational events or safety issues. [A hypothetical risk-matrix sketch follows this paper list.]
    • 符传博; 丹利
    • Abstract: The number of haze days and daily visibility data for 543 stations in China were used to define the probabilities of four grades of haze days: slight haze (SLH) days, light haze (LIH) days, moderate haze (MOH) days, and severe haze (SEH) days. The change trends of the four grades of haze were investigated and the following results were obtained. The highest probability was obtained for SLH days (95.138%), which showed a decreasing trend over the last 54 years with the fastest rate of decrease of -0.903%·(10 years)^-1 and a trend coefficient of -0.699, passing the 99.9% confidence level. The probabilities of LIH and MOH days increased steadily, whereas the probability of SEH days showed a slight downward trend during that period. The increasing probability of SLH days was mainly distributed to the east of 105°E and the south of 42°N, and the highest value of the trend coefficient was located in the Pearl River Delta and Yangtze River Delta regions. The increasing probability of LIH days was mainly distributed in eastern China and the southeastern coastal region. The spatial patterns for MOH and SEH days were similar to that for LIH days. An analysis of the four grades of haze days in cities of different sizes suggested that the probability of SLH days in large and medium cities clearly decreased during the last 54 years, whereas the probability of LIH days within 1.5° of a large or medium city showed an increasing trend and reached 100% after 1990; the probability of the other three grades was small and decreased significantly. [A sketch of the trend-coefficient calculation follows this paper list.]
    • WANG ShuSheng; ZHAO ZiLiang; XU YuQian; LI XiaoLong
    • Abstract: Nowadays, more and more interdisciplinary approaches, drawing on computer science, mathematics and geography, are being applied in urban planning. However, sophisticated mathematical methods such as the transition matrix, joint-count statistics, Bayes' rule and the Markov chain have not been deeply utilized in urban land use analysis. Furthermore, the newly developed parcel-level urban land use data method has only been tested in a few cases and has not yet been adopted in ancient city areas. Based on the above, this paper applies a series of mathematical methods to parcel-level urban land use data in a quantitative study of the Xi’an city wall area. Digitizing the maps of the study area compiled in 1935, 1963, 1995, 2007 and 2017 yields parcel-level urban land use data in four categories: Residential (R), Service (S), Culture (C) and Other (O), from which five parcel maps of different times are built up. A series of mathematical analyses shows that urban land use change in this area has three kinds of characteristics. In terms of the speed of urban land use change, the period between 1995 and 2007 is the fastest while the period from 1963 to 1995 is the slowest. In terms of the transition of urban land use, R and S are the main categories, and the transition from R to S is the dominant change; in addition, dominant neighbors have positive effects on parcel transitions. C is consistently increasing and has a clustered distribution. In terms of the influence of other factors such as environment and policy, C is a special category with the highest sensitivity to policies. The results of this research into the evolution of urban land use in the study area provide powerful support for land use planning and policy, and the mathematical methods offer a new perspective for the study of ancient Chinese cities. [A transition-matrix sketch follows this paper list.]
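
For the seismic hazard abstract above (Harmandar), the return periods of 72, 475 and 2475 years are the levels conventionally associated with about 50%, 10% and 2% probabilities of exceedance in 50 years under a Poisson occurrence model. The sketch below only illustrates that standard relation, p = 1 - exp(-t/T_R); it is not the R-CRISIS or SHARE computation, and the 50-year exposure time is a conventional assumption rather than a figure from the paper.

import math

def exceedance_probability(return_period_years: float, exposure_years: float = 50.0) -> float:
    """Probability of at least one exceedance during exposure_years,
    assuming Poisson-distributed occurrences with mean rate 1/return_period."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

for T_R in (72, 475, 2475):
    print(T_R, round(exceedance_probability(T_R), 3))
# Expected output: roughly 0.50, 0.10 and 0.02, the usual hazard-map levels.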
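
For the factorization-template abstract above (Beatty and Legge), the cardinality of the template [n], i.e. the number of monic irreducible polynomials of degree n over F_q, has the classical closed form N_q(n) = (1/n) * sum_{d | n} mu(d) * q^(n/d). The sketch below, which is not the authors' software, computes that count and the probability that a random monic polynomial of degree n is irreducible.

def mobius(m: int) -> int:
    """Mobius function mu(m) computed by trial factorization."""
    result = 1
    d = 2
    while d * d <= m:
        if m % d == 0:
            m //= d
            if m % d == 0:       # repeated prime factor, so mu(m) = 0
                return 0
            result = -result
        d += 1
    if m > 1:                    # one remaining prime factor
        result = -result
    return result

def irreducible_count(q: int, n: int) -> int:
    """Number of monic irreducible polynomials of degree n over F_q."""
    divisors = [d for d in range(1, n + 1) if n % d == 0]
    return sum(mobius(d) * q ** (n // d) for d in divisors) // n

q, n = 5, 4
count = irreducible_count(q, n)
print(count, count / q ** n)     # size of template [n] and its probability (150, 0.24)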
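
For the LWE abstract above (Zheng and Tian), a rough picture of the decryption-error event in a Regev-style scheme is that the aggregate noise exceeds a q/4 threshold. The sketch below is a simplified illustration under that assumption with illustrative toy parameters, not the estimate derived in the paper; it compares a Monte Carlo count of threshold crossings with the corresponding Gaussian tail bound.

import math
import random

def decryption_error_sketch(q: int = 4093, m: int = 256, sigma: float = 3.2,
                            trials: int = 20000) -> tuple:
    """Toy Regev-style setting: decryption fails when the sum of m rounded
    Gaussian error terms exceeds q/4 in absolute value."""
    threshold = q / 4.0
    failures = sum(
        1 for _ in range(trials)
        if abs(sum(round(random.gauss(0.0, sigma)) for _ in range(m))) >= threshold
    )
    # Gaussian tail bound: P(|N(0, m*sigma^2)| >= q/4) = erfc(q/4 / (sigma*sqrt(2m)))
    bound = math.erfc(threshold / (sigma * math.sqrt(2.0 * m)))
    return failures / trials, bound

print(decryption_error_sketch())
# With these parameters the bound is astronomically small, so the Monte Carlo
# count is expected to be zero: the "sufficiently small" regime in the abstract.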
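
For the prize-box abstract above (Osaka), the question is essentially a coupon-collector one: at which draw count is the first completion of the full set most likely? A minimal simulation, assuming k equally likely prize types (the paper's setting may be more general), is:

import random
from collections import Counter

def first_completion_time(k: int) -> int:
    """Number of draws until all k equally likely prize types have appeared."""
    seen, draws = set(), 0
    while len(seen) < k:
        seen.add(random.randrange(k))
        draws += 1
    return draws

def most_likely_completion_draw(k: int = 10, trials: int = 100000) -> int:
    counts = Counter(first_completion_time(k) for _ in range(trials))
    draw, _ = counts.most_common(1)[0]
    return draw

print(most_likely_completion_draw())
# Mode of the first-completion draw count (the abstract's "surprisingly small number of trials").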
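
For the record-flood abstract above (Duffey), one classical baseline, offered only as context and not as the paper's hypergeometric treatment, is that for independent, identically distributed annual maxima the probability that observation n sets a new record is 1/n. A quick simulation check:

import random

def record_rate(n_years: int, trials: int = 50000) -> float:
    """Fraction of simulated i.i.d. sequences in which year n_years sets a new record."""
    hits = 0
    for _ in range(trials):
        series = [random.random() for _ in range(n_years)]
        if series[-1] > max(series[:-1]):
            hits += 1
    return hits / trials

n = 25
print(record_rate(n), 1 / n)   # simulated record probability vs. the 1/n prediction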
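
For the risk-informed root cause analysis abstract above (Conger et al.), the central device is a two-dimensional probability-consequence matrix in which the consequence axis reflects the potential rather than only the observed impact. The sketch below is a hypothetical illustration of such a mapping; the level names, scoring rule and tier thresholds are assumptions, not values from the paper.

PROBABILITY_LEVELS = ["remote", "unlikely", "possible", "likely"]            # observed/expected likelihood
CONSEQUENCE_LEVELS = ["negligible", "marginal", "critical", "catastrophic"]  # potential, not just observed

def analysis_tier(probability: str, potential_consequence: str) -> int:
    """Map an event onto the risk matrix and return an analysis tier
    (1 = basic review, 3 = full root cause analysis)."""
    p = PROBABILITY_LEVELS.index(probability) + 1
    c = CONSEQUENCE_LEVELS.index(potential_consequence) + 1
    score = p * c                       # simple ordinal risk score
    if score >= 9:
        return 3
    if score >= 4:
        return 2
    return 1

# An event with marginal actual impact but catastrophic potential is analysed
# at tier 2 rather than tier 1 under these illustrative thresholds.
print(analysis_tier("unlikely", "catastrophic"))   # -> 2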
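
For the haze-grade abstract above (符传博 and 丹利), the two quantities quoted for SLH days, a rate of change per 10 years and a trend coefficient, are the linear slope of the annual probability series and its correlation with time. The sketch below shows how both are computed on synthetic data; the station records themselves are not reproduced here.

import numpy as np

years = np.arange(1961, 2015)                            # 54 hypothetical years
rng = np.random.default_rng(0)
slh_prob = 96.0 - 0.09 * (years - years[0]) + rng.normal(0.0, 0.3, years.size)

slope, _ = np.polyfit(years, slh_prob, 1)                # % per year
rate_per_decade = 10.0 * slope                           # comparable to the quoted %/(10 years) rate
trend_coefficient = np.corrcoef(years, slh_prob)[0, 1]   # comparable to the quoted trend coefficient
print(rate_per_decade, trend_coefficient)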
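
For the urban land use abstract above (WANG et al.), the transition-matrix and Markov chain machinery can be sketched on hypothetical parcel labels; the Xi'an parcel data are not reproduced here. The sketch estimates a row-stochastic matrix from two aligned snapshots and projects the category shares one step forward.

import numpy as np

CATEGORIES = ["R", "S", "C", "O"]   # Residential, Service, Culture, Other

def transition_matrix(parcels_t0, parcels_t1):
    """Row-stochastic transition matrix estimated from two aligned snapshots
    of per-parcel land use categories."""
    idx = {c: i for i, c in enumerate(CATEGORIES)}
    counts = np.zeros((len(CATEGORIES), len(CATEGORIES)))
    for a, b in zip(parcels_t0, parcels_t1):
        counts[idx[a], idx[b]] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Hypothetical categories of six parcels at two survey years
t0 = ["R", "R", "R", "S", "C", "O"]
t1 = ["R", "S", "S", "S", "C", "R"]
P = transition_matrix(t0, t1)

# Markov projection of category shares one step beyond the last snapshot
shares_t1 = np.array([t1.count(c) for c in CATEGORIES], dtype=float)
shares_t1 /= shares_t1.sum()
print(P)
print(shares_t1 @ P)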
