A highly parallelized and vectorized implementation of Max-Min Ant System on Intel Xeon Phi

Huw Lloyd, Martyn Amos

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

5 Citations (Scopus)

Abstract

The increasing trend in processor design towards many-core architectures with wide vector processing units is largely motivated by the fact that single core performance has hit a “power wall”, meaning that performance gains are currently achievable only through increasingly parallel and vectorized execution models. Consequently, applications can only exploit the full performance of modern processors if they achieve high parallel and vector efficiencies. In this paper, we illustrate how this might be achieved for the well-established Ant Colony Optimization metaheuristic. We describe a highly parallel and vectorized variant of the Max-Min Ant System algorithm applied to the Traveling Salesman Problem, and present two novel vectorized algorithms for selecting cities during the tour construction phase. We present experimental results from an implementation on the Intel® Xeon Phi™ platform, which show that very high parallel and vector efficiencies are achieved, and significant speedups are obtained compared to both the reference serial implementation and the previous best Xeon Phi implementation available in the literature.
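The paper's two vectorized city-selection algorithms are not reproduced here, but the general idea they build on is fitness-proportional (roulette-wheel) selection over candidate cities, which vectorizes naturally as a prefix sum followed by a search. Below is a minimal NumPy sketch of that baseline technique, not the authors' actual Xeon Phi implementation; the weight values and function name are illustrative assumptions.

```python
import numpy as np

def roulette_select(weights, rng):
    """Vectorized roulette-wheel selection of the next city.

    weights: 1-D array of per-city selection weights (e.g. pheromone^alpha *
    heuristic^beta in MMAS), with already-visited cities masked to zero.
    Illustrative baseline only, not the paper's algorithm.
    """
    cumsum = np.cumsum(weights)           # one vectorized prefix-sum pass
    r = rng.random() * cumsum[-1]         # uniform draw in [0, total)
    # binary search for the first slot whose cumulative weight exceeds r;
    # zero-weight (visited) cities can never be chosen
    return int(np.searchsorted(cumsum, r, side="right"))

# Example: cities 0 and 3 are visited (weight 0), so only 1, 2, 4 are eligible.
rng = np.random.default_rng(0)
w = np.array([0.0, 2.0, 1.0, 0.0, 3.0])
city = roulette_select(w, rng)
```

On a wide-vector machine such as the Xeon Phi, the prefix sum and search are the steps one would map onto SIMD lanes; the paper's contribution is precisely in restructuring this selection step for high vector efficiency.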
Original language: English
Title of host publication: IEEE Symposium Series on Computational Intelligence (SSCI)
Publisher: IEEE
ISBN (Electronic): 978-1-5090-4240-1
ISBN (Print): 978-1-5090-4241-8
DOIs
Publication status: Published - Dec 2016
Event: IEEE Symposium Series on Computational Intelligence - Athens, Greece
Duration: 6 Dec 2016 - 9 Dec 2016

Conference

Conference: IEEE Symposium Series on Computational Intelligence
Abbreviated title: SSCI 2016
Country: Greece
City: Athens
Period: 6/12/16 - 9/12/16

