Parallel metaheuristic

From Wikipedia, the free encyclopedia

Parallel metaheuristic is a class of techniques that are capable of reducing both the numerical effort (e.g., the number of function evaluations required) and the run time of a metaheuristic. To this end, concepts and technologies from the field of parallelism in computer science are used to enhance and even completely modify the behavior of existing metaheuristics. Just as there is a long list of metaheuristics, such as evolutionary algorithms, particle swarm optimization, ant colony optimization, and simulated annealing, there is also a large set of parallel techniques strongly or loosely based on them, whose behavior encompasses the parallel execution of multiple algorithm components that cooperate in some way to solve a problem on a given parallel hardware platform.

Background

Figure: An example of different implementations of the same PSO metaheuristic model.

In practice, optimization (and searching, and learning) problems are often NP-hard, complex, and time-consuming. Two major approaches are traditionally used to tackle these problems: exact methods and metaheuristics. Exact methods can guarantee optimal solutions but are often impractical, since they are extremely time-consuming for real-world problems (large-dimension, heavily constrained, multimodal, time-varying, epistatic problems). Conversely, metaheuristics provide sub-optimal (sometimes optimal) solutions in a reasonable time. Thus, metaheuristics usually make it possible to meet the resolution deadlines imposed in industry, and they allow the study of general problem classes rather than particular problem instances. In general, many of the best performing techniques, in terms of both precision and effort, for solving complex and real-world problems are metaheuristics. Their fields of application range from combinatorial optimization, bioinformatics, and telecommunications to economics, software engineering, etc. These fields are full of tasks that need fast solutions of high quality. See [1] for more details on complex applications.

Metaheuristics fall into two categories: trajectory-based metaheuristics and population-based metaheuristics. The main difference between these two kinds of methods lies in the number of tentative solutions used in each step of the (iterative) algorithm. A trajectory-based technique starts with a single initial solution and, at each step of the search, the current solution is replaced by another (often the best) solution found in its neighborhood. Trajectory-based metaheuristics usually find a locally optimal solution quickly, and so they are called exploitation-oriented methods, promoting intensification in the search space. On the other hand, population-based algorithms make use of a population of solutions. The initial population is in this case randomly generated (or created with a greedy algorithm) and then improved through an iterative process. At each generation of the process, the whole population (or a part of it) is replaced by newly generated individuals (often the best ones). These techniques are called exploration-oriented methods, since their main strength lies in the diversification of the search space.

Most basic metaheuristics are sequential. Although their use significantly reduces the time required by the search process, this time remains high for real-world problems arising in both academic and industrial domains. Therefore, parallelism comes as a natural way not only to reduce the search time, but also to improve the quality of the provided solutions.

For a comprehensive discussion on how parallelism can be mixed with metaheuristics see [2].

Parallel trajectory-based metaheuristics


Metaheuristics for solving optimization problems can be viewed as walks through neighborhoods, tracing search trajectories through the solution domain of the problem at hand:

Algorithm: Sequential trajectory-based general pseudo-code
    Generate(s(0));                       // Initial solution
    t := 0;                               // Numerical step
    while not TerminationCriterion(s(t)) do
        s′(t) := SelectMove(s(t));        // Exploration of the neighborhood
        if AcceptMove(s′(t)) then
            s(t + 1) := ApplyMove(s′(t)); // The move becomes the new current solution
        else
            s(t + 1) := s(t);             // Otherwise the current solution is kept
        endif
        t := t + 1;
    endwhile

Walks are performed by iterative procedures that move from one solution to another in the solution space (see the algorithm above). This kind of metaheuristic performs moves in the neighborhood of the current solution, i.e., it has a perturbative nature. The walk starts from a solution that is randomly generated or obtained from another optimization algorithm. At each iteration, the current solution is replaced by another one selected from the set of its neighboring candidates. The search process is stopped when a given condition is satisfied (a maximum number of iterations, finding a solution with a target quality, being stuck for a given time, etc.).
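
To make the pseudo-code above concrete, the following is a minimal sketch in Python of a sequential trajectory-based search (a simple hill climber on a bit string). The onemax fitness function, the bit-flip move, and the acceptance rule are illustrative assumptions, not part of any particular library:

    import random

    def onemax(s):                      # illustrative fitness: number of 1-bits
        return sum(s)

    def random_neighbor(s):             # perturbative move: flip one random bit
        i = random.randrange(len(s))
        neighbor = list(s)
        neighbor[i] = 1 - neighbor[i]
        return neighbor

    def trajectory_search(n_bits=64, max_steps=10000):
        s = [random.randint(0, 1) for _ in range(n_bits)]    # Generate(s(0))
        for _ in range(max_steps):                           # termination criterion
            candidate = random_neighbor(s)                   # SelectMove
            if onemax(candidate) >= onemax(s):               # AcceptMove
                s = candidate                                # ApplyMove
            if onemax(s) == n_bits:                          # target quality reached
                return s
        return s

    if __name__ == "__main__":
        print("best fitness:", onemax(trajectory_search()))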

A powerful way to achieve high computational efficiency with trajectory-based methods is the use of parallelism. Different parallel models have been proposed for trajectory-based metaheuristics, and three of them are commonly used in the literature: the parallel multi-start model, the parallel exploration and evaluation of the neighborhood (or parallel moves model), and the parallel evaluation of a single solution (or move acceleration model):

  • Parallel multi-start model: It consists of simultaneously launching several trajectory-based methods in order to compute better and more robust solutions. They may be heterogeneous or homogeneous, independent or cooperative, start from the same or different solution(s), and be configured with the same or different parameters.
  • Parallel moves model: It is a low-level master-slave model that does not alter the behavior of the heuristic: a sequential search would compute the same result, only more slowly. At the beginning of each iteration, the master sends the current solution to the distributed nodes; each node separately explores and evaluates its own portion of the neighborhood and returns its results to the master (a minimal sketch of this model is given after this list).
  • Move acceleration model: The quality of each move is evaluated in a parallel, centralized way. This model is particularly interesting when the evaluation function can itself be parallelized because it is CPU time-consuming and/or I/O intensive. In that case, the function can be viewed as an aggregation of a number of partial functions that can be run in parallel.
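
The sketch below, using Python's standard multiprocessing module, illustrates the parallel moves model: the master builds the neighborhood of the current solution, the workers evaluate the neighbors in parallel, and the master then applies the best accepted move, so the search visits the same solutions as its sequential counterpart. The bit-flip neighborhood and the onemax fitness are illustrative assumptions:

    from multiprocessing import Pool
    import random

    def onemax(s):                          # illustrative fitness function
        return sum(s)

    def neighborhood(s):                    # all one-bit-flip neighbors of s
        return [s[:i] + [1 - s[i]] + s[i+1:] for i in range(len(s))]

    def parallel_moves_search(n_bits=64, steps=200, workers=4):
        s = [random.randint(0, 1) for _ in range(n_bits)]
        with Pool(processes=workers) as pool:
            for _ in range(steps):
                neighbors = neighborhood(s)              # master builds the candidate moves
                fitnesses = pool.map(onemax, neighbors)  # workers evaluate them in parallel
                best = max(range(len(neighbors)), key=lambda i: fitnesses[i])
                if fitnesses[best] >= onemax(s):         # same decision as a sequential search
                    s = neighbors[best]
        return s

    if __name__ == "__main__":
        print("best fitness:", onemax(parallel_moves_search()))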

Parallel population-based metaheuristics


Population-based metaheuristics are stochastic search techniques that have been successfully applied to many real and complex applications (epistatic, multimodal, multi-objective, and highly constrained problems). A population-based algorithm is an iterative technique that applies stochastic operators to a pool of individuals: the population (see the algorithm below). Every individual in the population is the encoded version of a tentative solution. An evaluation function associates a fitness value with every individual, indicating its suitability to the problem. Iteratively, the probabilistic application of variation operators on selected individuals guides the population towards tentative solutions of higher quality. The most well-known metaheuristic families based on the manipulation of a population of solutions are evolutionary algorithms (EAs), ant colony optimization (ACO), particle swarm optimization (PSO), scatter search (SS), differential evolution (DE), and estimation of distribution algorithms (EDAs).

Algorithm: Sequential population-based metaheuristic pseudo-code
    Generate(P(0));                               // Initial population
    t := 0;                                       // Numerical step
    while not TerminationCriterion(P(t)) do
        Evaluate(P(t));                           // Evaluation of the population
        P′(t) := Select(P(t));                    // Selection of the parents
        P′′(t) := ApplyVariationOperators(P′(t)); // Generation of new solutions
        P(t + 1) := Replace(P(t), P′′(t));        // Building the next population
        t := t + 1;
    endwhile
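
As a concrete (sequential) instance of this pseudo-code, the following is a minimal Python sketch of a generational algorithm on bit strings; the binary tournament selection, one-point crossover, bit-flip mutation, and onemax fitness are illustrative assumptions:

    import random

    def onemax(ind):                              # illustrative fitness function
        return sum(ind)

    def tournament(pop, fits):                    # Select: binary tournament
        a, b = random.sample(range(len(pop)), 2)
        return pop[a] if fits[a] >= fits[b] else pop[b]

    def crossover_mutate(p1, p2, p_mut=0.01):     # ApplyVariationOperators
        cut = random.randrange(1, len(p1))        # one-point crossover
        child = p1[:cut] + p2[cut:]
        return [1 - g if random.random() < p_mut else g for g in child]  # bit-flip mutation

    def evolve(n_bits=64, pop_size=50, generations=200):
        pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
        for _ in range(generations):              # TerminationCriterion
            fits = [onemax(ind) for ind in pop]   # Evaluate(P(t))
            pop = [crossover_mutate(tournament(pop, fits), tournament(pop, fits))
                   for _ in range(pop_size)]      # Replace: full generational replacement
        return max(pop, key=onemax)

    if __name__ == "__main__":
        print("best fitness:", onemax(evolve()))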

For non-trivial problems, executing the reproductive cycle of a simple population-based method on long individuals and/or large populations usually requires high computational resources. In general, evaluating the fitness function for every individual is frequently the most costly operation of these algorithms. Consequently, a variety of algorithmic issues are studied in order to design efficient techniques. These issues usually involve defining new operators, hybrid algorithms, parallel models, and so on.

Parallelism arises naturally when dealing with populations, since each individual belonging to the population is an independent unit (at least under the Pittsburgh approach; other approaches, such as the Michigan one, do not consider individuals to be independent units). Indeed, the performance of population-based algorithms is often improved when they run in parallel. Two parallelization strategies are especially relevant for population-based algorithms:

  1. Parallelization of computations, in which the operations commonly applied to each of the individuals are performed in parallel, and
  2. Parallelization of the population, in which the population is split into different parts that can be simply exchanged or evolved separately and then joined later.

At the beginning of the parallelization history of these algorithms, the well-known master-slave (also known as global parallelization or farming) method was used. In this approach, a central processor performs the selection operations while the associated slave processors (workers) apply the variation operators and evaluate the fitness function. This algorithm has the same behavior as the sequential one, although its computational efficiency is improved, especially for time-consuming objective functions. On the other hand, many researchers use a pool of processors to speed up the execution of a sequential algorithm, simply because independent runs can be made more rapidly by using several processors than by using a single one. In this case, no interaction at all exists between the independent runs.
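
Under the assumptions of the sequential sketch above (and reusing its illustrative onemax, tournament, and crossover_mutate helpers, which must be defined at module level so that worker processes can import them), the master-slave model only changes how the population is evaluated: the fitness computations are farmed out to a pool of worker processes while the rest of the loop is unchanged. A minimal sketch using Python's multiprocessing module:

    from multiprocessing import Pool

    def evolve_master_slave(n_bits=64, pop_size=50, generations=200, workers=4):
        pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
        with Pool(processes=workers) as pool:
            for _ in range(generations):
                fits = pool.map(onemax, pop)      # slaves evaluate the fitness in parallel
                pop = [crossover_mutate(tournament(pop, fits), tournament(pop, fits))
                       for _ in range(pop_size)]  # master applies selection and variation
        return max(pop, key=onemax)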

However, most parallel population-based techniques found in the literature impose some kind of spatial structure on the individuals and then parallelize the resulting chunks across a pool of processors. Among the most widely known types of structured metaheuristics, the distributed (or coarse-grained) and cellular (or fine-grained) algorithms are very popular optimization procedures.

In the case of distributed algorithms, the population is partitioned into a set of subpopulations (islands) in which isolated serial algorithms are executed. Sparse exchanges of individuals are performed among these islands with the goal of introducing some diversity into the subpopulations, thus preventing the search from getting stuck in local optima. In order to design a distributed metaheuristic, several decisions must be made. Among them, a chief decision is to determine the migration policy: the topology (logical links between the islands), the migration rate (number of individuals that undergo migration in every exchange), the migration frequency (number of steps in every subpopulation between two successive exchanges), and the selection/replacement of the migrants.
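
A minimal sketch of such a distributed scheme, again reusing the helpers of the sequential sketch above: for brevity the islands are evolved here in a single process (each could run on its own processor), arranged in a ring topology, with a migration rate of one individual and a migration frequency of ten generations; all of these policy values are illustrative assumptions:

    def evolve_islands(n_islands=4, n_bits=64, pop_size=30,
                       generations=200, migration_freq=10):
        islands = [[[random.randint(0, 1) for _ in range(n_bits)]
                    for _ in range(pop_size)] for _ in range(n_islands)]
        for gen in range(generations):
            for pop in islands:                              # isolated serial evolution
                fits = [onemax(ind) for ind in pop]
                pop[:] = [crossover_mutate(tournament(pop, fits), tournament(pop, fits))
                          for _ in range(pop_size)]
            if gen % migration_freq == 0:                    # sparse exchange of individuals
                for i, pop in enumerate(islands):            # ring topology
                    neighbor = islands[(i + 1) % n_islands]
                    migrant = max(pop, key=onemax)           # migration rate: one individual
                    worst = min(range(pop_size), key=lambda j: onemax(neighbor[j]))
                    neighbor[worst] = list(migrant)          # replacement in the target island
        return max((ind for pop in islands for ind in pop), key=onemax)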

In the case of a cellular method, the concept of neighborhood is introduced, so that an individual may only interact with its nearby neighbors in the breeding loop. The small, overlapping neighborhoods of the algorithm help in exploring the search space, because the slow diffusion of solutions through the population provides a kind of exploration, while exploitation takes place inside each neighborhood. See [3] for more information on cellular genetic algorithms and related models.
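
A minimal sketch of the cellular scheme under the same assumptions: individuals are placed on a toroidal grid and each one is recombined only with a mate drawn from its four nearest cells, so that good solutions diffuse slowly through the grid:

    def evolve_cellular(width=8, height=8, n_bits=64, generations=200):
        grid = [[[random.randint(0, 1) for _ in range(n_bits)]
                 for _ in range(width)] for _ in range(height)]
        for _ in range(generations):
            new_grid = [[None] * width for _ in range(height)]
            for y in range(height):
                for x in range(width):
                    # neighborhood: the four nearest cells on a toroidal grid
                    neigh = [grid[(y - 1) % height][x], grid[(y + 1) % height][x],
                             grid[y][(x - 1) % width], grid[y][(x + 1) % width]]
                    mate = max(neigh, key=onemax)            # local selection
                    child = crossover_mutate(grid[y][x], mate)
                    # local replacement: keep the better of parent and child
                    new_grid[y][x] = child if onemax(child) >= onemax(grid[y][x]) else grid[y][x]
            grid = new_grid
        return max((ind for row in grid for ind in row), key=onemax)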

Also, hybrid models have been proposed in which a two-level approach to parallelization is undertaken. In general, the higher level of parallelization is a coarse-grained implementation, while each basic island runs a cellular method, a master-slave method, or even another distributed method.

See also


References

  • G. Luque, E. Alba, Parallel Genetic Algorithms: Theory and Real World Applications, Springer-Verlag, ISBN 978-3-642-22083-8, July 2011
  • E. Alba, C. Blum, P. Isasi, C. León, J.A. Gómez (eds.), Optimization Techniques for Solving Complex Problems, Wiley, ISBN 978-0-470-29332-4, 2009
  • E. Alba, B. Dorronsoro, Cellular Genetic Algorithms, Springer-Verlag, ISBN 978-0-387-77609-5, 2008
  • N. Nedjah, E. Alba, L. de Macedo Mourelle, Parallel Evolutionary Computations, Springer-Verlag, ISBN 3-540-32837-8, 2006
  • E. Alba, Parallel Metaheuristics: A New Class of Algorithms, Wiley, ISBN 0-471-67806-6, July 2005

External links

  • MALLBA
  • JGDS
  • DEME
  • xxGA
  • PSALHE-EA
  • Paradiseo