Sitemap

A list of all the posts and pages found on the site. For you robots out there, there is an XML version available for digesting as well.

Publications

Is CC-(1+1) EA more efficient than (1+1) EA on either separable or inseparable problems?

Published in IEEE Congress on Evolutionary Computation (CEC), 2023

Abstract: This paper investigates cooperative co-evolutionary algorithms (CoEAs) for large-scale optimisation problems, using runtime analysis to understand their behaviour. It settles an open conjecture by proving that the basic cooperative co-evolutionary (1+1) EA has an expected optimisation time of Θ(n log n) on linear functions. Empirical analysis on NK-LANDSCAPE and k-MAXSAT problems shows that performance can be improved by tuning the block length, yielding more precise runtime bounds and insights into more complicated inseparable problems.

Recommended citation: Per Kristian Lehre, Shishen Lin. (2023). "Is CC-(1+1) EA more efficient than (1+1) EA on either separable or inseparable problems?" IEEE Congress on Evolutionary Computation (CEC). 8 pages, Chicago, USA, 2023. https://ieeexplore.ieee.org/document/10254149
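To give a feel for the algorithm under analysis, here is a minimal sketch of a CC-(1+1) EA on OneMax (the simplest linear function). It assumes a round-robin block schedule and standard bit-flip mutation restricted to the active block; the function name and these scheduling details are my own illustrative choices, not necessarily the exact algorithm from the paper.

```python
import random

def cc_one_plus_one_ea(n, block_len, fitness, max_iters=100_000, seed=0):
    """Sketch of a cooperative co-evolutionary (1+1) EA.

    The n-bit string is split into blocks; each iteration mutates only the
    active block (round-robin) and keeps the offspring if it is no worse."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = fitness(x)
    blocks = [range(i, min(i + block_len, n)) for i in range(0, n, block_len)]
    for t in range(max_iters):
        block = blocks[t % len(blocks)]
        y = x[:]
        for i in block:  # bit-flip mutation at rate 1/ell inside the active block
            if rng.random() < 1.0 / len(block):
                y[i] ^= 1
        fy = fitness(y)
        if fy >= fx:  # elitist acceptance
            x, fx = y, fy
        if fx == n:   # assumes the optimum is the all-ones string with value n
            return t + 1
    return max_iters

onemax = sum  # OneMax counts the number of one-bits
runtime = cc_one_plus_one_ea(64, 8, onemax)
```

On linear functions each block behaves like an independent (1+1) EA on its own sub-problem, which is the intuition behind comparing block lengths empirically.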

Runtime Analysis of a Co-Evolutionary Algorithm: Overcoming Negative Drift in Maximin-Optimisation

Published in Proceedings of the 16th ACM/SIGEVO Conference on Foundations of Genetic Algorithms (FOGA), 2023

Abstract: This paper develops a mathematically rigorous method for analysing co-evolutionary algorithms, showing how they obtain Maximin solutions and contributing improved runtime analysis together with new mathematical tools.

Recommended citation: Mario Alejandro Hevia Fajardo, Per Kristian Lehre, and Shishen Lin. (2023). "Runtime analysis of a Co-Evolutionary Algorithm: Overcoming Negative Drift in Maximin-Optimisation." Proceedings of the 16th ACM/SIGEVO Conference on Foundations of Genetic Algorithms (FOGA). 11 pages, Potsdam, Germany, 2023. https://dl.acm.org/doi/10.1145/3594805.3607132

Concentration Tail-Bound Analysis of Coevolutionary and Bandit Learning Algorithms

Published in Proceedings of the 33rd International Joint Conference on Artificial Intelligence (IJCAI), 2024

Abstract: Runtime analysis, as a branch of the theory of AI, studies how the number of iterations algorithms take before finding a solution (their runtime) depends on the design of the algorithm and the problem structure. Drift analysis is a state-of-the-art tool for estimating the runtime of randomised algorithms, such as bandit and evolutionary algorithms. Drift refers roughly to the expected progress towards the optimum per iteration. This paper considers the problem of deriving concentration tail-bounds on the runtime of algorithms. It provides a novel drift theorem that gives precise exponential tail-bounds given positive, weak, zero and even negative drift. Previously, such exponential tail-bounds were missing in the case of weak, zero, or negative drift.

Recommended citation: Per Kristian Lehre, and Shishen Lin. (2024). "Concentration Tail-Bound Analysis of Coevolutionary and Bandit Learning Algorithms." Proceedings of the 33rd International Joint Conference on Artificial Intelligence (IJCAI). 7 pages, Jeju, South Korea, 2024. https://www.ijcai.org/proceedings/2024/0767.pdf
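The idea of drift and runtime concentration can be illustrated numerically. The sketch below (an illustration of the concept, not the paper's drift theorem) simulates a biased random walk with positive drift δ = 2p − 1 towards the target at 0, started at distance n: the additive drift estimate predicts an expected hitting time of about n/δ, and with positive drift the probability of greatly exceeding that estimate decays rapidly.

```python
import random

def hitting_time(n, p, rng, cap=10_000):
    """Steps until a walk (step -1 w.p. p, +1 w.p. 1-p) started at n reaches 0."""
    x, t = n, 0
    while x > 0 and t < cap:
        x += -1 if rng.random() < p else 1
        t += 1
    return t

rng = random.Random(1)
n, p = 30, 0.7
drift = 2 * p - 1  # expected progress towards 0 per step (here 0.4)
times = [hitting_time(n, p, rng) for _ in range(2000)]
mean_t = sum(times) / len(times)  # additive drift predicts about n / drift = 75
# with positive drift, far exceeding the drift estimate is exponentially rare
tail = sum(t > 3 * n / drift for t in times) / len(times)
```

The interesting regimes in the paper are exactly the ones this toy experiment does not cover: weak, zero, or negative drift, where such exponential tail-bounds were previously unavailable.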

Overcoming Binary Adversarial Optimisation with Competitive Coevolution

Published in 18th International Conference on Parallel Problem Solving From Nature (PPSN), 2024

Abstract: Co-evolutionary algorithms (CoEAs), which pair candidate designs with test cases, are frequently used in adversarial optimisation, particularly for binary test-based problems where designs and tests yield binary outcomes. The effectiveness of designs is determined by their performance against tests, and the value of tests is based on their ability to identify failing designs, often leading to more sophisticated tests and improved designs. However, CoEAs can exhibit complex, sometimes pathological behaviours like disengagement. Through runtime analysis, we aim to rigorously analyse whether CoEAs can efficiently solve test-based adversarial optimisation problems in an expected polynomial runtime.

Recommended citation: Per Kristian Lehre, and Shishen Lin. (2024). "Overcoming Binary Adversarial Optimisation with Competitive Coevolution." 18th International Conference on Parallel Problem Solving From Nature (PPSN). 14 pages, Hagenberg, Austria, 2024, to appear. https://arxiv.org/pdf/2407.17875
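The design-versus-test interaction can be sketched with a toy binary test-based problem: a design is an n-bit string, and it passes test j iff bit j is set. The test side hunts for failing tests while the design side must not regress on any test found so far. The archive rule and all names below are my own illustrative choices (one simple way to sidestep disengagement), not the algorithm analysed in the paper.

```python
import random

def coevolve(n, max_gens=200_000, seed=0):
    """Toy competitive coevolution on a binary test-based problem.

    Designs are n-bit strings; test j is passed iff bit j is set. Tests
    that the design fails are kept in an archive, and a mutated design is
    accepted only if it passes at least as many archived tests."""
    rng = random.Random(seed)
    design = [0] * n
    archive = set()
    for g in range(max_gens):
        # test step: a random test the current design fails is worth keeping
        cand = rng.randrange(n)
        if design[cand] == 0:
            archive.add(cand)
        # design step: flip one bit, accept iff no archived progress is lost
        child = design[:]
        child[rng.randrange(n)] ^= 1
        if sum(child[t] for t in archive) >= sum(design[t] for t in archive):
            design = child
        if all(design):  # the design now defeats every possible test
            return g + 1
    return max_gens

gens = coevolve(16)
```

Without the archive, the design side in this toy model accepts almost every mutation and simply random-walks — a small-scale glimpse of the pathological behaviours, such as disengagement, that make rigorous runtime analysis of CoEAs interesting.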

Talks

Teaching

Machine Learning and Intelligent Data Analysis

Undergraduate course, University of Birmingham, School of Computer Science, 2021

Machine learning studies how computers can autonomously learn from available data, without being explicitly programmed. The 'information revolution' has generated large amounts of data, but valuable information is often hidden and hence unusable. The module provides a solid foundation in machine learning and advanced data analysis, giving an overview of the core concepts, methods, and algorithms for analysing and learning from data. The emphasis is on the underlying theoretical foundations, illustrated through a set of methods widely used in practice. This gives students a good understanding of how, why, and when various modern machine learning and data analysis methods work. In this module, I worked as a teaching assistant, helping students understand the course in weekly Q&A meetings and marking coursework.

Evolutionary Computation

Undergraduate/Master course, University of Birmingham, School of Computer Science, 2022

Evolutionary Algorithms (EAs) are biologically inspired randomised optimisation techniques (randomised heuristics). EAs aim to find a global optimum while assuming little knowledge of the fitness function — the function we want to optimise — or of how it is defined, in contrast to gradient-based methods such as gradient descent. EAs can therefore provide useful solutions in real-world scenarios suited to derivative-free methods. In this module, I worked as a teaching assistant, helping students understand the course in weekly Q&A meetings and marking coursework.

Mathematical Foundations of Artificial Intelligence and Machine Learning

Undergraduate/Master course, University of Birmingham, School of Computer Science, 2022

We start by understanding the basic mathematical structures in which data and machine learning models are formulated. In this module, I worked as a teaching assistant, helping students understand the course in weekly Q&A meetings and writing bi-weekly quiz sheets for students.

Evolutionary Computation

Undergraduate/Master course, University of Birmingham, School of Computer Science, 2023

The module introduction is the same as for the Evolutionary Computation entry above.

Evolutionary Computation

Undergraduate/Master course, University of Birmingham, School of Computer Science, 2024

The module introduction is the same as for the Evolutionary Computation entry above.