
Keynotes



We are thrilled to welcome the following keynote speakers.


Leslie Valiant

Harvard University, United States


Biological Evolution as a Form of Learning

Abstract

Living organisms function according to protein circuits. Darwin's theory of evolution suggests that these circuits have evolved through variation guided by natural selection. However, it is currently unknown what variation mechanisms can give rise to protein circuits of the complexity found in biology, within realistic population sizes and realistic numbers of generations.

We suggest that computational learning theory offers the framework for investigating this question of how circuits can come into being via a Darwinian process without a designer. We formulate evolution as a form of learning from examples. The targets of the learning process are the protein expression functions that come closest to the best behavior in the specific environment. The learning process is constrained so that the feedback from the experiences is Darwinian. We formulate a notion of evolvability that distinguishes function classes that are evolvable with polynomially bounded resources from those that are not. The dilemma is that if the function class that describes the expression levels of proteins in terms of each other is too restrictive, then it will not support biology, while if it is too expressive, then no evolution algorithm will exist to navigate it. We shall review current work in this area.
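
To make the learning-from-examples framing concrete, the minimal Python sketch below (an illustration under assumed details, not Valiant's formal model) evolves a monotone conjunction by random variation, retaining a variant only if its aggregate empirical performance on sampled examples is at least as good as the current one; the aggregate score is the only feedback the process sees, in the spirit of the Darwinian constraint described above. The hidden target, sample size, and acceptance rule are illustrative assumptions.

    # A minimal sketch of evolution as learning from examples: random variation,
    # selection on aggregate empirical performance only. Illustrative assumptions
    # throughout; this is not Valiant's formal evolvability model.
    import random

    N_VARS = 20          # number of Boolean attributes
    SAMPLE_SIZE = 500    # examples drawn per generation
    IDEAL = {2, 5, 11}   # hidden target: conjunction of these variables (assumed)

    def conj_value(conj, x):
        """Evaluate a monotone conjunction (a set of variable indices) on example x."""
        return all(x[i] for i in conj)

    def performance(conj, sample):
        """Fraction of sampled examples on which conj agrees with the ideal function."""
        return sum(conj_value(conj, x) == conj_value(IDEAL, x) for x in sample) / len(sample)

    def mutate(conj):
        """Single random variation: add or drop one variable."""
        variant = set(conj)
        variant.symmetric_difference_update({random.randrange(N_VARS)})
        return variant

    def evolve(generations=200):
        current = set(random.sample(range(N_VARS), 5))   # arbitrary starting representation
        for _ in range(generations):
            sample = [[random.random() < 0.5 for _ in range(N_VARS)]
                      for _ in range(SAMPLE_SIZE)]
            variant = mutate(current)
            # Selection sees only aggregate performance, never individual labels.
            if performance(variant, sample) >= performance(current, sample):
                current = variant
        return current

    print("evolved conjunction:", sorted(evolve()))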

Biosketch

Leslie Valiant was educated at King's College, Cambridge; Imperial College, London; and at Warwick University where he received his Ph.D. in computer science in 1974. He is currently T. Jefferson Coolidge Professor of Computer Science and Applied Mathematics in the School of Engineering and Applied Sciences at Harvard University, where he has taught since 1982. Before coming to Harvard he had taught at Carnegie Mellon University, Leeds University, and the University of Edinburgh.

His work has ranged over several areas of theoretical computer science, particularly complexity theory, learning, and parallel computation. He also has interests in computational neuroscience, evolution and artificial intelligence and is the author of two books, Circuits of the Mind, and Probably Approximately Correct.

He received the Nevanlinna Prize at the International Congress of Mathematicians in 1986, the Knuth Award in 1997, the European Association for Theoretical Computer Science EATCS Award in 2008, and the 2010 A. M. Turing Award. He is a Fellow of the Royal Society (London) and a member of the National Academy of Sciences (USA).




Kalyanmoy Deb

Michigan State University, United States


Evolutionary Computation's Niche for Solving Multi-Criterion Optimization Problems

Abstract

Evolutionary computation (EC) involves a careful, collaborative, and iterative update of a population of solutions to bring it near a desired target. In a single-objective optimization problem, the respective optimal solution is often the single target. In a multi-criterion optimization problem, the target is a set of Pareto-optimal solutions. Although the EC field started with single-objective problems, EC researchers soon realized that these methods were ideal for simultaneously finding a well-diversified set of multiple Pareto-optimal solutions for multi-criterion optimization problems, a clear niche of EC compared to its point-based classical counterparts. In this keynote talk, we provide a brief chronology of the evolutionary multi-criterion optimization (EMO) field over almost three decades, the key challenges it has faced, and the key events and publications that pushed the field forward. We also provide a brief account of current activities and the speaker's view of what lies ahead.
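
As a concrete illustration of the target being a set rather than a single point, the short Python sketch below (illustrative only, not part of the talk) applies the Pareto-dominance relation to a small, hypothetical two-objective population and extracts its non-dominated front, i.e. the set that an EMO algorithm tries to approximate and spread solutions across.

    # Pareto dominance and the non-dominated front of a finite population,
    # for minimization of all objectives. Objective values below are hypothetical.
    from typing import List, Tuple

    def dominates(a: Tuple[float, ...], b: Tuple[float, ...]) -> bool:
        """a dominates b if it is no worse in every objective and strictly better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def nondominated(population: List[Tuple[float, ...]]) -> List[Tuple[float, ...]]:
        """Return the non-dominated subset (the first Pareto front) of the population."""
        return [p for p in population if not any(dominates(q, p) for q in population)]

    # Hypothetical (cost, weight) pairs; an EMO algorithm evolves the whole
    # population toward a well-spread approximation of the Pareto front.
    pop = [(1.0, 9.0), (2.0, 7.5), (3.0, 8.0), (4.0, 4.0), (6.0, 3.5), (7.0, 5.0)]
    print(nondominated(pop))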

Biosketch

Kalyanmoy Deb is the Koenig Endowed Chair Professor in the Department of Electrical and Computer Engineering at Michigan State University. His research interests are in evolutionary optimization and its applications to multi-criterion optimization, bilevel optimization, modeling, and machine learning. He has been awarded the IEEE CIS Evolutionary Computation Pioneer Award, the Infosys Prize, the TWAS Prize in Engineering Sciences, the CajAstur Mamdani Prize, the Distinguished Alumni Award from IIT Kharagpur, the Edgeworth-Pareto Award, the Bhatnagar Prize in Engineering Sciences, and the Bessel Research Award from Germany. He is a Fellow of IEEE and ASME. He has published over 520 research papers, with over 137,000 Google Scholar citations and an h-index of 115. More information can be found at http://www.coin-lab.org.


Darrell Whitley

Colorado State University, United States


Removing Randomness from Evolutionary Algorithms

Abstract

It is natural to think of Evolutionary Algorithms as highly stochastic search methods. This can also make Evolutionary Algorithms, and particularly recombination, quite difficult to analyze. One way to reduce randomness involves the quadratization of functions, which is commonly used by modern optimization methods and also has applications in quantum computing. After a function is made quadratic, random mutation is obsolete and unnecessary; the location of improving moves can be calculated deterministically, on average in O(1) time. Seemingly impossible problems, such as the Needle-in-a-Haystack, become trivial to solve in quadratic form. One can also provably tunnel, or jump, between local optima and quasi-local optima in O(n) time using deterministic genetic recombination. The talk also explores how removing randomness from Evolutionary Algorithms might provide new insights into natural evolution. Finally, a form of evolutionary algorithm is proposed in which premature convergence is impossible and the evolutionary potential of the population remains open-ended.
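
The Python sketch below (an illustration under assumed details, not the speaker's implementation) shows the idea behind deterministic improving moves on a quadratic pseudo-Boolean function: the gain of flipping each bit is kept in a buffer, and after a flip only the bits sharing a quadratic term with the flipped bit need their gains updated, so improving moves are located without any random mutation. The random instance and all parameters are illustrative assumptions.

    # Deterministic bit-flip local search on a random quadratic pseudo-Boolean
    # function f(x) = sum_i a_i x_i + sum_{i<j} b_ij x_i x_j, x in {0,1}^n.
    # Illustrative sketch only.
    import random

    def make_instance(n, density=0.1, seed=1):
        """Random linear terms a_i and sparse pairwise terms b_ij."""
        rng = random.Random(seed)
        linear = [rng.uniform(-1, 1) for _ in range(n)]
        quad = {}
        for i in range(n):
            for j in range(i + 1, n):
                if rng.random() < density:
                    quad[(i, j)] = rng.uniform(-1, 1)
        return linear, quad

    def flip_gain(i, x, linear, neighbors):
        """Change in f(x) if bit i is flipped, given the current assignment x."""
        delta = 1 - 2 * x[i]                  # +1 if bit goes 0 -> 1, -1 if 1 -> 0
        return delta * (linear[i] + sum(w * x[j] for j, w in neighbors[i]))

    def local_search(n=200, steps=10000):
        linear, quad = make_instance(n)
        neighbors = [[] for _ in range(n)]    # adjacency lists of quadratic interactions
        for (i, j), w in quad.items():
            neighbors[i].append((j, w))
            neighbors[j].append((i, w))
        x = [random.randint(0, 1) for _ in range(n)]
        gains = [flip_gain(i, x, linear, neighbors) for i in range(n)]
        improving = {i for i, g in enumerate(gains) if g > 0}
        for _ in range(steps):
            if not improving:                 # no positive gain left: a local optimum
                break
            i = improving.pop()               # take any known improving move; nothing is sampled
            x[i] ^= 1
            # Only bit i and bits sharing a quadratic term with it can change gain.
            for k in [i] + [j for j, _ in neighbors[i]]:
                gains[k] = flip_gain(k, x, linear, neighbors)
                if gains[k] > 0:
                    improving.add(k)
                else:
                    improving.discard(k)
        return x, gains

    x, gains = local_search()
    print("no improving bit flip remains:", all(g <= 0 for g in gains))

Because each bit interacts with only a bounded number of other bits in such an instance, each flip updates a constant expected number of gains, which is the intuition behind the average O(1) cost of locating an improving move.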

Biosketch

Darrell Whitley is a Professor of Computer Science at Colorado State University. He served as the Chair of the International Society for Genetic Algorithms from 1993 to 1997, and as the Editor-in-Chief of the journal Evolutionary Computation from 1997 to 2003. He was Chair of the Governing Board of ACM SIGEVO from 2007 to 2011. He was named an ACM Fellow in 2019 for his contributions to the field of genetic and evolutionary computation.