Posts by Collection

publications

Is CC-(1+1) EA more efficient than (1+1) EA on either separable or inseparable problems?

Published in IEEE Congress on Evolutionary Computation (CEC), 2023

Abstract: This paper investigates cooperative co-evolutionary algorithms (CoEAs) for large-scale optimisation problems, using runtime analysis to understand their behaviour. By proving that the basic cooperative co-evolutionary (1+1) EA has an expected optimisation time of Θ(n log n) on linear functions, it resolves an open conjecture. Empirical analysis on NK-LANDSCAPE and k-MAXSAT problems shows that performance can be improved by tuning the block length, yielding more precise runtime bounds and insights into more complicated inseparable problems.
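As a toy illustration of the algorithm family analysed here, the following minimal sketch runs a standard (1+1) EA on OneMax, the simplest linear function (this is my own illustrative code, not the CC-(1+1) EA from the paper; the Θ(n log n) bound above concerns expected optimisation time on such linear functions):

```python
import random

def one_plus_one_ea(n, fitness, max_iters=100_000):
    """Minimal (1+1) EA: flip each bit independently with probability 1/n,
    keep the offspring if it is at least as fit (elitist acceptance).
    Returns the iteration at which the optimum (fitness n) is first found."""
    x = [random.randint(0, 1) for _ in range(n)]
    for t in range(1, max_iters + 1):
        y = [b ^ (random.random() < 1 / n) for b in x]  # standard bit mutation
        if fitness(y) >= fitness(x):
            x = y
        if fitness(x) == n:  # optimum of OneMax reached
            return t
    return max_iters

# OneMax, the simplest linear function: f(x) = sum of the bits.
random.seed(0)
t = one_plus_one_ea(20, sum)
print(t)
```

On n bits this typically finishes in O(n log n) iterations, consistent with the classical analysis of the (1+1) EA on linear functions.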

Recommended citation: Per Kristian Lehre, Shishen Lin. (2023). "Is CC-(1+1) EA more efficient than (1+1) EA on either separable or inseparable problems?" IEEE Congress on Evolutionary Computation (CEC). 8 pages, Chicago, USA, 2023. https://ieeexplore.ieee.org/document/10254149

Runtime Analysis of a Co-Evolutionary Algorithm: Overcoming Negative Drift in Maximin-Optimisation

Published in Proceedings of the Companion Conference on Genetic and Evolutionary Computation (GECCO Companion), 2023

Abstract: This paper presents a mathematically rigorous analysis of a co-evolutionary algorithm, showing that it obtains Maximin-solutions efficiently, supported by improved runtime analysis and new mathematical tools.

Recommended citation: Mario Alejandro Hevia Fajardo, Per Kristian Lehre, and Shishen Lin. (2023). "Runtime analysis of a Co-Evolutionary Algorithm: Overcoming Negative Drift in Maximin-Optimisation." Proceedings of the Companion Conference on Genetic and Evolutionary Computation (GECCO Companion). 4 pages, Lisbon, Portugal, 2023. https://dl.acm.org/doi/10.1145/3594805.3607132

Runtime Analysis of a Co-Evolutionary Algorithm: Overcoming Negative Drift in Maximin-Optimisation

Published in Proceedings of the 16th ACM/SIGEVO Conference on Foundations of Genetic Algorithms (FOGA), 2023

Abstract: Co-evolutionary algorithms have found several applications in game-theoretic settings and optimisation problems with an adversary, particularly where the strategy space is discrete and exponentially large, and where classical game-theoretic methods fail. However, applying co-evolutionary algorithms is difficult because they often display pathological behaviour, such as cyclic behaviour and evolutionary forgetting. These challenges have prevented their broad application.

Recommended citation: Mario Alejandro Hevia Fajardo, Per Kristian Lehre, and Shishen Lin. (2023). "Runtime analysis of a Co-Evolutionary Algorithm: Overcoming Negative Drift in Maximin-Optimisation." Proceedings of the 16th ACM/SIGEVO Conference on Foundations of Genetic Algorithms (FOGA). 11 pages, Potsdam, Germany, 2023. https://dl.acm.org/doi/10.1145/3594805.3607132

Concentration Tail-Bound Analysis of Coevolutionary and Bandit Learning Algorithms

Published in Proceedings of the 33rd International Joint Conference on Artificial Intelligence (IJCAI), 2024

Abstract: Runtime analysis, as a branch of the theory of AI, studies how the number of iterations algorithms take before finding a solution (its runtime) depends on the design of the algorithm and the problem structure. Drift analysis is a state-of-the-art tool for estimating the runtime of randomised algorithms, such as bandit and evolutionary algorithms. Drift refers roughly to the expected progress towards the optimum per iteration. This paper considers the problem of deriving concentration tail-bounds on the runtime of algorithms. It provides a novel drift theorem that gives precise exponential tail-bounds given positive, weak, zero and even negative drift. Previously, such exponential tail bounds were missing in the case of weak, zero, or negative drift.

Recommended citation: Per Kristian Lehre, and Shishen Lin. (2024). "Concentration Tail-Bound Analysis of Coevolutionary and Bandit Learning Algorithms." Proceedings of the 33rd International Joint Conference on Artificial Intelligence (IJCAI). 7 pages, Jeju, South Korea, 2024. https://www.ijcai.org/proceedings/2024/0767.pdf

Overcoming Binary Adversarial Optimisation with Competitive Coevolution

Published in 18th International Conference on Parallel Problem Solving From Nature (PPSN), 2024

Abstract: Co-evolutionary algorithms (CoEAs), which pair candidate designs with test cases, are frequently used in adversarial optimisation, particularly for binary test-based problems where designs and tests yield binary outcomes. The effectiveness of designs is determined by their performance against tests, and the value of tests is based on their ability to identify failing designs, often leading to more sophisticated tests and improved designs. However, CoEAs can exhibit complex, sometimes pathological behaviours like disengagement. Through runtime analysis, we aim to rigorously analyse whether CoEAs can efficiently solve test-based adversarial optimisation problems in an expected polynomial runtime.

Recommended citation: Per Kristian Lehre, and Shishen Lin. (2024). "Overcoming Binary Adversarial Optimisation with Competitive Coevolution." 18th International Conference on Parallel Problem Solving From Nature (PPSN). 14 pages, Hagenberg, Austria, 2024. https://arxiv.org/pdf/2407.17875

No Free Lunch Theorem and Black-Box Complexity Analysis for Adversarial Optimisation

Published in The Thirty-eighth Annual Conference on Neural Information Processing Systems (NeurIPS), 2024

Abstract: Black-box optimisation is an important area of optimisation. The original No Free Lunch (NFL) theorems highlight the limitations of traditional black-box optimisation and learning algorithms, serving as a theoretical foundation for traditional optimisation. No Free Lunch analysis in adversarial (also called maximin) optimisation is a long-standing open problem (Wolpert and Macready, 1997; Wolpert and Macready, 2005). This paper first rigorously proves an NFL theorem for general black-box adversarial optimisation when Nash Equilibrium (NE) is taken as the solution concept. We emphasise that the solution concept (i.e. how optimality is defined in adversarial optimisation) is the key to our NFL theorem: if Nash Equilibrium is the solution concept, then the average performance of all black-box adversarial optimisation algorithms is the same. Moreover, we introduce black-box complexity to analyse black-box adversarial optimisation algorithms, employing Yao's minimax principle and our new NFL theorem to derive general lower bounds on the query complexity of finding a Nash Equilibrium. Finally, we illustrate the practical ramifications of our results on simple two-player zero-sum games. More specifically, no black-box optimisation algorithm can find the unique Nash equilibrium of a two-player zero-sum game in fewer than logarithmically many queries with respect to the search space size, and no black-box algorithm can solve an arbitrary bimatrix game with a unique NE in fewer than linearly many queries in the size of the input payoff matrices.
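To make the query-complexity setting concrete, here is a toy brute-force pure-NE finder for a bimatrix game (my own illustrative sketch, not the paper's method): each payoff entry it reads counts as one black-box query, so it uses a number of queries linear in the matrix size, matching the order of the linear lower bound stated above.

```python
import itertools

def pure_nash_equilibria(A, B):
    """Find all pure-strategy Nash equilibria of a bimatrix game by brute
    force. A[i][j] / B[i][j] are the row / column player's payoffs; in the
    black-box model, every entry read counts as one query."""
    m, n = len(A), len(A[0])
    eqs = []
    for i, j in itertools.product(range(m), range(n)):
        row_best = all(A[i][j] >= A[k][j] for k in range(m))  # no row deviation helps
        col_best = all(B[i][j] >= B[i][l] for l in range(n))  # no column deviation helps
        if row_best and col_best:
            eqs.append((i, j))
    return eqs

# Prisoner's dilemma (0 = cooperate, 1 = defect): (defect, defect) is the unique pure NE.
A = [[3, 0], [5, 1]]
B = [[3, 5], [0, 1]]
print(pure_nash_equilibria(A, B))  # [(1, 1)]
```

The lower-bound results say this linear query cost cannot be beaten in general for bimatrix games with a unique NE.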

Recommended citation: Per Kristian Lehre, and Shishen Lin. (2024). "No Free Lunch Theorem and Black-Box Complexity Analysis for Adversarial Optimisation." 38th Annual Conference on Neural Information Processing Systems (NeurIPS). 9 pages, Vancouver, Canada, 2024. To appear. TBD

talks

teaching

Machine Learning and Intelligent Data Analysis

Undergraduate course, University of Birmingham, School of Computer Science, 2021

Machine learning studies how computers can autonomously learn from available data without being explicitly programmed. The ‘information revolution’ has generated large amounts of data, but valuable information is often hidden and hence unusable. The module provides a solid foundation in machine learning and advanced data analysis, giving an overview of the core concepts, methods, and algorithms for analysing and learning from data. The emphasis is on the underlying theoretical foundations, illustrated through a set of methods widely used in practice, so that students gain a good understanding of how, why, and when various modern machine learning and data analysis methods work. In this module, I worked as a teaching assistant, helping students understand the material in weekly Q&A sessions and marking coursework.

Evolutionary Computation

Undergraduate/Master course, University of Birmingham, School of Computer Science, 2022

Evolutionary Algorithms (EAs) are a family of biologically inspired randomised optimisation techniques (randomised heuristics). EAs aim to find a global optimum while assuming little knowledge of the fitness function (the function we want to optimise) or how it is defined, in contrast to gradient-based methods such as gradient descent. EAs can therefore provide useful solutions in real-world scenarios where derivative-free methods are required. In this module, I worked as a teaching assistant, helping students understand the material in weekly Q&A sessions and marking coursework.

Mathematical Foundations of Artificial Intelligence and Machine Learning

Undergraduate/Master course, University of Birmingham, School of Computer Science, 2022

We start by understanding the basic mathematical structures in which data and machine learning models are formulated. In this module, I worked as a teaching assistant, helping students understand the material in weekly Q&A sessions and writing bi-weekly quiz sheets for students.

Evolutionary Computation

Undergraduate/Master course, University of Birmingham, School of Computer Science, 2023

The same introduction as the previous EC module.

Evolutionary Computation

Undergraduate/Master course, University of Birmingham, School of Computer Science, 2024

The same introduction as the previous EC module.