Metaheuristic Methods and Their Applications. Lin-Yu Tseng (曾怜玉), Institute of Networking and Multimedia, Department of Computer Science and Engineering, National Chung Hsing University.

Metaheuristic Methods and Their Applications Lin-Yu Tseng ( 曾怜玉 ) Institute of Networking and Multimedia Department of Computer Science and Engineering National Chung Hsing University

OUTLINE
I. Optimization Problems
II. Strategies for Solving NP-hard Optimization Problems
III. What is a Metaheuristic Method?
IV. Trajectory Methods
V. Population-Based Methods
VI. Multiple Trajectory Search
VII. The Applications of Metaheuristic Methods
VIII. Conclusions

I. Optimization Problems
Computer Science: Traveling Salesman Problem, Maximum Clique Problem
Operational Research: Flow Shop Scheduling Problem, P-Median Problem
Many optimization problems are NP-hard.

Optimization problems Calculus-based method

Hill climbing

How about this?

An example: TSP (Traveling Salesman Problem)
A solution is a sequence of the cities (a tour). [Figure: two example tours with their lengths; one has tour length 63.] There are (n − 1)! tours in total.

II. Strategies for Solving NP-hard Optimization Problems
Branch-and-bound → finds an exact solution.
Approximation algorithms – e.g., there is an approximation algorithm for the TSP that finds a tour with tour length ≤ 1.5 × (optimal tour length) in O(n^3) time.
Heuristic methods → deterministic.
Metaheuristic methods → heuristic + randomization.

III. What is a Metaheuristic Method?
Meta: in an upper level. Heuristic: to find.
"A metaheuristic is formally defined as an iterative generation process which guides a subordinate heuristic by combining intelligently different concepts for exploring and exploiting the search space; learning strategies are used to structure information in order to find efficiently near-optimal solutions." [Osman and Laporte 1996]

Fundamental Properties of Metaheuristics [Blum and Roli 2003] Metaheuristics are strategies that “guide” the search process. The goal is to efficiently explore the search space in order to find (near-)optimal solutions. Techniques which constitute metaheuristic algorithms range from simple local search procedures to complex learning processes. Metaheuristic algorithms are approximate and usually non-deterministic.

Fundamental Properties of Metaheuristics (cont.) They may incorporate mechanisms to avoid getting trapped in confined areas of the search space. The basic concepts of metaheuristics permit an abstract level description. Metaheuristics are not problem-specific. Metaheuristics may make use of domain-specific knowledge in the form of heuristics that are controlled by the upper level strategy. Today’s more advanced metaheuristics use search experience (embodied in some form of memory) to guide the search.

IV. Trajectory Methods
1. Basic Local Search: Iterative Improvement
Improve(N(S)) can be ① first improvement, ② best improvement, or ③ an intermediate option.
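As a concrete sketch, the improvement scheme can be written in a few lines of Python; the toy objective and neighborhood below are assumptions for illustration, not from the slides:

```python
def local_search(f, x0, neighbors, strategy="best"):
    """Iterative improvement: repeatedly replace the current solution S
    with an improving neighbor from N(S), stopping at a local optimum."""
    x = x0
    while True:
        cands = neighbors(x)
        if strategy == "first":
            nxt = next((y for y in cands if f(y) < f(x)), None)
        else:  # best improvement: scan all of N(S)
            best = min(cands, key=f)
            nxt = best if f(best) < f(x) else None
        if nxt is None:
            return x  # no improving neighbor: local optimum reached
        x = nxt

# Toy minimization problem (an assumption for illustration):
# f(x) = (x - 7)^2 over the integers, with N(x) = {x - 1, x + 1}.
f = lambda x: (x - 7) ** 2
neighbors = lambda x: [x - 1, x + 1]
```

With strategy="first" the scan stops at the first improving neighbor; "best" examines all of N(S) before moving.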

Trajectory Methods [Figure: a search trajectory moving through solutions x_1 → x_2 → x_3 → x_4 → x_5.]

2. Simulated Annealing – Kirkpatrick et al. (Science, 1983)
A worse move is accepted with probability P = exp(−Δf / T), where the temperature T is gradually decreased by a cooling schedule (e.g., T ← αT).
Simulated annealing = random walk + iterative improvement.
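A minimal simulated-annealing sketch in Python, assuming a geometric cooling schedule and a toy 1-D objective (both are illustrative choices, not from the slides):

```python
import math
import random

def simulated_annealing(f, x0, neighbor, t0=10.0, alpha=0.95, steps=2000, seed=1):
    """Minimize f: always accept improving moves; accept a worse move
    with probability exp(-delta / T); cool geometrically, T <- alpha * T."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = f(y)
        delta = fy - fx
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = y, fy                 # move (possibly uphill)
            if fx < fbest:
                best, fbest = x, fx       # remember the best-so-far
        t *= alpha                        # cooling schedule
    return best, fbest

# Toy 1-D objective with many local minima (assumptions for the example)
f = lambda x: x * x + 10 * math.sin(3 * x)
neighbor = lambda x, rng: x + rng.uniform(-1, 1)
```

At high T the walk is nearly random; as T → 0 the rule degenerates into plain iterative improvement, matching the "random walk + iterative improvement" view above.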

3. Tabu Search – Glover 1986
Simple tabu search:
– Tabu list
– Tabu tenure: the length of the tabu list
– Aspiration condition
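The three ingredients above can be sketched as follows; the cyclic toy landscape is an assumption for illustration (plain hill climbing from node 1 is stuck, but tabu search walks through worse nodes to the optimum):

```python
from collections import deque

def tabu_search(f, x0, neighbors, tenure=3, iters=30):
    """Simple tabu search: always move to the best non-tabu neighbor,
    even if it is worse (to escape local optima); the aspiration
    condition admits a tabu move that beats the best found so far."""
    x = x0
    best, fbest = x0, f(x0)
    tabu = deque(maxlen=tenure)   # tabu list; maxlen is the tabu tenure
    for _ in range(iters):
        cands = [y for y in neighbors(x)
                 if y not in tabu or f(y) < fbest]   # aspiration condition
        if not cands:
            break
        x = min(cands, key=f)
        tabu.append(x)
        if f(x) < fbest:
            best, fbest = x, f(x)
    return best, fbest

# Toy landscape on a 7-node cycle (values are assumptions for the example);
# node 1 (f=3) is a local optimum, the global optimum is node 5 (f=1).
vals = [5, 3, 4, 2, 6, 1, 7]
f = lambda i: vals[i]
nbrs = lambda i: [(i - 1) % 7, (i + 1) % 7]
```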

Tabu Search

4. Variable Neighborhood Search – Hansen and Mladenović 1999
Composed of three phases: ① shaking, ② local search, ③ move.
Uses a set of neighborhood structures N_1, …, N_kmax.
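A sketch of the shake / local-search / move loop, assuming integer solutions, a simple hill-climbing local search, and shaking neighborhoods N_k of growing radius (all illustrative assumptions, not the authors' exact scheme):

```python
import random

def hill_climb(f, x):
    """Best-improvement local search on the integers, N(x) = {x-1, x+1}."""
    while True:
        y = min((x - 1, x + 1), key=f)
        if f(y) >= f(x):
            return x
        x = y

def vns(f, x0, k_max=3, iters=40, seed=0):
    """Variable neighborhood search: ① shake within N_k, ② local search,
    ③ move and reset k on improvement, else try the larger N_{k+1}."""
    rng = random.Random(seed)
    x = x0
    for _ in range(iters):
        k = 1
        while k <= k_max:
            x1 = x + rng.randint(-4 * k, 4 * k)   # ① shaking in N_k(x)
            x2 = hill_climb(f, x1)                # ② local search
            if f(x2) < f(x):                      # ③ move or not
                x, k = x2, 1
            else:
                k += 1
    return x

# Toy objective (an assumption): every multiple of 4 is a local optimum
# of hill_climb; the global optimum is x = 0.
f = lambda x: abs(x) + (5 if x % 4 else 0)
```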

Variable Neighborhood Search [Figure: shaking in N_1 produces x′, local search improves it to x″; on improvement the incumbent Best is updated and k resets, otherwise the search moves on to the larger neighborhood N_2.]

V. Population-Based Methods
1. Genetic Algorithm – Holland 1975
– Coding of a solution: chromosome
– Fitness function: related to the objective function
– Initial population

An example: TSP (Traveling Salesman Problem)
A solution is a sequence of the cities (a tour). [Figure: two example tours with their lengths; one has tour length 63.]

Genetic Algorithm operators: Reproduction (Selection), Crossover, Mutation

Example 1: Maximize f(x) = x² where x ∈ I and 0 ≤ x ≤ 31

1. Coding of a solution: a five-bit integer
2. Fitness function: F(x) = f(x) = x²
3. Initial population: randomly generated

Reproduction Roulette Wheel

Reproduction

Crossover

Mutation: with a mutation probability P_m = 0.001 and 20 bits transferred per generation, the expected number of mutated bits is 20 × 0.001 = 0.02 bits.
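The whole of Example 1 can be sketched as a short program; the population size, crossover rate, and mutation rate below are illustrative assumptions:

```python
import random

def ga_max_x2(pop_size=6, generations=30, pc=0.9, pm=0.02, seed=5):
    """Toy GA for Example 1: maximize f(x) = x^2, x coded as 5 bits.
    Roulette-wheel reproduction, one-point crossover, bit-flip mutation.
    (Population size and rates here are assumptions for illustration.)"""
    rng = random.Random(seed)
    fit = lambda c: int(c, 2) ** 2                       # F(x) = x^2
    pop = ["".join(rng.choice("01") for _ in range(5)) for _ in range(pop_size)]
    for _ in range(generations):
        # Reproduction: roulette-wheel (fitness-proportional) selection
        w = [fit(c) for c in pop]
        mating = rng.choices(pop, weights=w if sum(w) else None, k=pop_size)
        # One-point crossover on consecutive pairs
        nxt = []
        for a, b in zip(mating[::2], mating[1::2]):
            if rng.random() < pc:
                cut = rng.randint(1, 4)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            nxt += [a, b]
        # Bit-flip mutation with probability pm per bit
        pop = ["".join("10"[int(bit)] if rng.random() < pm else bit for bit in c)
               for c in nxt]
    return max(int(c, 2) for c in pop)
```

Because selection pressure is proportional to x², the population tends to drift toward the upper end of [0, 31].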

Example 2: The word matching problem
"to be or not to be" → "tobeornottobe" (13 letters); the chance of a random match is (1/26)^13 ≈ 4.03 × 10⁻¹⁹.
The lowercase letters in ASCII are represented by numbers in the range [97, 122]; the target string is [116, 111, 98, 101, 111, 114, 110, 111, 116, 116, 111, 98, 101].

Initial population

The corresponding strings rzfqdhujardbe niedwrvrjahfj cyueisosqvcb fbfvgramtekuvs kbuqrtjtjensb fwyqykktzyojh tbxblsoizggwm dtriusrgkmbg jvpbgemtpjalq

2. Ant Colony Optimization – Dorigo 1992 Pheromone

Initialization
Loop
    Loop
        Each ant applies a state transition rule to incrementally build a solution
        and applies a local updating rule to the pheromone
    Until every ant has built a complete solution
    A global pheromone updating rule is applied
Until End_Condition

Example: TSP. A simple heuristic – a greedy method: choose the shortest edge to go out of a node. Solution: 1 5 4 3 2.
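This greedy rule is the nearest-neighbor heuristic; a sketch on a hypothetical 5-city distance matrix (the instance is an assumption, not the one pictured in the slides):

```python
def nearest_neighbor_tour(dist, start=0):
    """Greedy TSP heuristic: from the current city, always take the
    shortest edge to a not-yet-visited city."""
    n = len(dist)
    tour = [start]
    unvisited = set(range(n)) - {start}
    while unvisited:
        cur = tour[-1]
        nxt = min(unvisited, key=lambda c: dist[cur][c])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(dist, tour):
    """Length of the closed tour (returns to the start city)."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

# Hypothetical symmetric 5-city instance (an assumption for the example)
D = [[0, 2, 9, 10, 7],
     [2, 0, 6, 4, 3],
     [9, 6, 0, 8, 5],
     [10, 4, 8, 0, 6],
     [7, 3, 5, 6, 0]]
```

On this instance the greedy tour is 0 → 1 → 4 → 2 → 3 with length 28, which is not optimal (0 → 1 → 3 → 2 → 4 has length 26): the greedy rule is fast but myopic, which is exactly the gap ACO's pheromone mechanism tries to close.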

Each ant builds a solution using a step-by-step constructive decision policy. How does ant k choose the next city to visit? The probability that ant k moves from city r to city s is
p_k(r, s) = [τ(r,s)] · [η(r,s)]^β / Σ_{u ∈ J_k(r)} [τ(r,u)] · [η(r,u)]^β, for s ∈ J_k(r),
where J_k(r) is the set of cities ant k has not yet visited, η(r,s) = 1/δ(r,s), and δ(r,s) is a distance measure associated with edge (r,s).

Local update of pheromone: τ(r,s) ← (1 − ρ) · τ(r,s) + ρ · τ₀, where τ₀ is the initial pheromone level.
Global update of pheromone: τ(r,s) ← (1 − ρ) · τ(r,s) + ρ · Δτ(r,s), where Δτ(r,s) = 1/L_gb if (r,s) is an edge of the globally best tour (of length L_gb), and 0 otherwise.
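Putting the transition rule and the two update rules together gives a compact ACS-style sketch for the TSP; the parameter values and the 5-city instance are illustrative assumptions:

```python
import random

def ant_colony_tsp(dist, n_ants=8, iters=40, beta=2.0, rho=0.1, seed=0):
    """ACS-style sketch: ants build tours with a pheromone/heuristic
    transition rule; a local update pulls used edges back toward tau0;
    a global update reinforces the best tour found so far."""
    rng = random.Random(seed)
    n = len(dist)
    avg = sum(map(sum, dist)) / (n * n)     # rough average edge length
    tau0 = 1.0 / (n * avg)                  # initial pheromone level
    tau = [[tau0] * n for _ in range(n)]
    best_tour, best_len = None, float("inf")

    def length(t):
        return sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))

    for _ in range(iters):
        for _ant in range(n_ants):
            r = rng.randrange(n)
            tour, unvisited = [r], set(range(n)) - {r}
            while unvisited:
                # state transition: p ~ tau(r,s) * (1/dist(r,s))^beta
                cand = list(unvisited)
                w = [tau[r][s] * (1.0 / dist[r][s]) ** beta for s in cand]
                s = rng.choices(cand, weights=w)[0]
                tau[r][s] = (1 - rho) * tau[r][s] + rho * tau0  # local update
                tour.append(s)
                unvisited.remove(s)
                r = s
            L = length(tour)
            if L < best_len:
                best_tour, best_len = tour, L
        # global update along the best-so-far tour
        for i in range(n):
            a, b = best_tour[i], best_tour[(i + 1) % n]
            tau[a][b] = (1 - rho) * tau[a][b] + rho / best_len
    return best_tour, best_len

# Hypothetical symmetric 5-city instance (an assumption for the example)
D = [[0, 2, 9, 10, 7],
     [2, 0, 6, 4, 3],
     [9, 6, 0, 8, 5],
     [10, 4, 8, 0, 6],
     [7, 3, 5, 6, 0]]
```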

3. Particle Swarm Optimization – Kennedy and Eberhart 1995
The population is initialized by assigning random positions and velocities; potential solutions are then flown through hyperspace.
Each particle keeps track of its "best" (highest-fitness) position in hyperspace: this is called pBest for an individual particle, and gBest for the best in the population.
At each time step, each particle stochastically accelerates toward its pBest and gBest.

Particle Swarm Optimization Process
1. Initialize the population in hyperspace (positions and velocities).
2. Evaluate the fitness of each individual particle.
3. Modify velocities based on the personal best and global best.
4. Terminate on some condition.
5. Go to step 2.

Initialization: - Positions and velocities

Modify velocities based on personal best and global best. [Figure: a particle at position x_t with velocity v moves to a new position x_{t+1}, accelerated toward both its pBest and the swarm's gBest.]

Particle Swarm Optimization: modification of velocities and positions
v_{t+1} = v_t + c_1 · rand() · (pBest − x_t) + c_2 · rand() · (gBest − x_t)
x_{t+1} = x_t + v_{t+1}
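The update equations translate directly into code; the swarm size, iteration count, and sphere objective are illustrative assumptions (practical PSO variants usually add an inertia weight or velocity clamping, which this bare sketch omits):

```python
import random

def pso(f, dim, bounds, n_particles=20, iters=100, c1=2.0, c2=2.0, seed=0):
    """Minimize f: each particle accelerates stochastically toward its
    own best position (pBest) and the swarm's best (gBest) using
    v <- v + c1*rand()*(pBest - x) + c2*rand()*(gBest - x)."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[rng.uniform(-(hi - lo), hi - lo) * 0.1 for _ in range(dim)]
         for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    pbest_f = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (V[i][d]
                           + c1 * rng.random() * (pbest[i][d] - X[i][d])
                           + c2 * rng.random() * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]        # x_{t+1} = x_t + v_{t+1}
            fx = f(X[i])
            if fx < pbest_f[i]:           # update the particle's pBest...
                pbest[i], pbest_f[i] = X[i][:], fx
                if fx < gbest_f:          # ...and possibly the swarm's gBest
                    gbest, gbest_f = X[i][:], fx
    return gbest, gbest_f

# Toy objective: sphere function, global minimum 0 at the origin.
sphere = lambda x: sum(v * v for v in x)
```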

VI. Multiple Trajectory Search (MTS)
1. MTS begins with multiple initial solutions, so multiple trajectories are produced in the search process.
2. The initial solutions are classified as foreground solutions and background solutions, and the search is focused mainly on the foreground solutions.

3. The search neighborhood begins at a large scale, then contracts step by step until it reaches a pre-defined tiny size; at that point it is reset to its original size.
4. Three candidate local search methods were designed. When the neighborhood of a solution is to be searched, the three methods are tested first so that the one that best fits the landscape of the solution can be chosen.

The MTS focuses its search mainly on foreground solutions. It performs an iterated local search using one of the three candidate local search methods. By choosing the local search method that best fits the landscape of a solution's neighborhood, the MTS may find its way to a local optimum or the global optimum.

Step 1: Find M initial solutions that are uniformly distributed over the solution space.
Step 2: Iteratively apply local search to all M initial solutions.
Step 3: Choose K foreground solutions based on the grades of the solutions.
Step 4: Iteratively apply local search to the K foreground solutions.
Step 5: If the termination condition is met, stop; else go to Step 3.
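Steps 1–5 can be sketched as follows; this is a rough illustration of the outer loop only (the real MTS builds its initial solutions with a simulated orthogonal array, grades solutions, and chooses among the three specialized local searches; here random initialization and a single stand-in local search are used):

```python
import random

def mts_sketch(f, dim, bounds, M=10, K=3, rounds=25, seed=0):
    """Simplified MTS outer loop: M spread-out initial solutions, local
    search on all of them, then repeated local search on the K best
    ("foreground") solutions with a contracting search range."""
    rng = random.Random(seed)
    lo, hi = bounds

    # Step 1: M initial solutions spread over the solution space
    sols = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(M)]

    def local_search(x, r):
        # Stand-in search: try steps of -r and +r/2 along each dimension
        # in a randomly permuted order, keeping only improvements.
        x = x[:]
        for d in rng.sample(range(dim), dim):
            for step in (-r, r / 2):
                y = x[:]
                y[d] += step
                if f(y) < f(x):
                    x = y
        return x

    r = (hi - lo) / 2                     # search range
    for _ in range(rounds):
        # Steps 3-4: apply local search to the K best (foreground) solutions
        sols.sort(key=f)
        sols[:K] = [local_search(x, r) for x in sols[:K]]
        r /= 2                            # contract the neighborhood...
        if r < 1e-6 * (hi - lo):
            r = (hi - lo) / 2             # ...and reset it when it gets tiny
    return min(sols, key=f)

sphere = lambda x: sum(v * v for v in x)  # toy objective, minimum 0 at origin
```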

The three local search methods begin their search in a very large "neighborhood". The neighborhood then contracts step by step until it reaches a pre-defined tiny size, after which it is reset to its original size.

Three Local Search Methods
Local Search 1: searches along each of the n dimensions in a randomly permuted order.
Local Search 2: searches along a direction determined by considering about n/4 of the dimensions.

Local Search 3: also searches along each of the n dimensions in a randomly permuted order, but it evaluates multiple solutions along each dimension within a pre-specified range.

Multiple Trajectory Search for Large Scale Global Optimization Winner of 2008 IEEE World Congress on Computational Intelligence Competition on Large Scale Global Optimization

Multiple Trajectory Search for Unconstrained/Constrained Multi-Objective Optimization: Second and Third Places at the 2009 IEEE Congress on Evolutionary Computation Competition on Performance Assessment of Constrained/Bound-Constrained Multi-Objective Optimization Algorithms

VII. The Applications of Metaheuristics
1. Solving NP-hard optimization problems: Traveling Salesman Problem, Maximum Clique Problem, Flow Shop Scheduling Problem, P-Median Problem
2. Search problems in many applications: feature selection in pattern recognition, automatic clustering, machine learning (e.g., neural network training)

VIII. Conclusions
For NP-hard optimization problems and complicated search problems, metaheuristic methods are very good choices: they are more efficient than branch-and-bound and obtain better-quality solutions than heuristic methods.
Hybridization of metaheuristics.
How to make the search more systematic?
How to make the search more controllable?
How to make the performance scalable?

References
1. Osman, I. H., and Laporte, G. "Metaheuristics: A bibliography". Annals of Operations Research, 63, 513–623, 1996.
2. Blum, C., and Roli, A. "Metaheuristics in combinatorial optimization: Overview and conceptual comparison". ACM Computing Surveys, 35(3), 268–308, 2003.
3. Kirkpatrick, S., Gelatt, C. D., and Vecchi, M. P. "Optimization by simulated annealing". Science, 220(4598), 671–680, 13 May 1983.
4. Glover, F. "Future paths for integer programming and links to artificial intelligence". Computers & Operations Research, 13, 533–549, 1986.
5. Hansen, P., and Mladenović, N. "An introduction to variable neighborhood search". In Metaheuristics: Advances and Trends in Local Search Paradigms for Optimization, S. Voß, S. Martello, I. Osman, and C. Roucairol, Eds., Kluwer Academic Publishers, Chapter 30, 433–458, 1999.
6. Holland, J. H. Adaptation in Natural and Artificial Systems. The University of Michigan Press, Ann Arbor, MI, 1975.
7. Dorigo, M. Optimization, Learning and Natural Algorithms (in Italian). Ph.D. thesis, DEI, Politecnico di Milano, Italy, 1992.
8. Kennedy, J., and Eberhart, R. "Particle swarm optimization". Proceedings of the 1995 IEEE International Conference on Neural Networks, pp. 1942–1948, IEEE Press, 1995.

References
9. Lin-Yu Tseng* and Chun Chen, "Multiple Trajectory Search for Large Scale Global Optimization", Proceedings of the 2008 IEEE Congress on Evolutionary Computation, June 2–6, 2008, Hong Kong.
10. Lin-Yu Tseng* and Chun Chen, "Multiple Trajectory Search for Unconstrained/Constrained Multi-Objective Optimization", Proceedings of the 2009 IEEE Congress on Evolutionary Computation, May 18–21, 2009, Trondheim, Norway.

Thank you! Wishing you all good health and a happy mood!