Abstract
We consider the problem of efficiently searching for high-performing neural architectures while favouring networks of reduced complexity. We theorise that a complementary set of proxies can be employed in multi-objective optimisation to balance model performance against network size. We demonstrate that a low-cost proxy for the test accuracy of a candidate architecture can be derived from a series of inferences alone. This proxy is paired with a complexity metric based on the number of parameters in the model, and the two properties are used as objectives in a multi-objective setting. A Pareto Archived Evolution Strategy optimises the two objectives simultaneously and delivers a diverse collection of solutions. The method is shown to discover low-complexity architectures with only a minor loss of accuracy compared to the global optima, and to do so with statistical reliability. This work offers a proof-of-concept Neural Architecture Search algorithm that removes training from the process entirely. The proposed approach is examined in terms of its search behaviour and the complexity reduction that can be achieved, by comparing discovered solutions to the top-performing architectures in the search space.
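The abstract outlines the recipe at a high level; purely as an illustration (and not the authors' code), the Python sketch below pairs a stand-in training-free proxy with a parameter-count objective and maintains a non-dominated archive in the spirit of a Pareto Archived Evolution Strategy. The architecture encoding, proxy function, and mutation operator are hypothetical placeholders, not the components used in the paper.

```python
# Hypothetical sketch, not the authors' implementation: a toy training-free
# NAS loop with two objectives (maximise a proxy score, minimise parameters)
# and a Pareto archive of non-dominated solutions.
import random


def proxy_score(arch):
    """Stand-in for a training-free accuracy proxy obtained from a series of
    inferences on the untrained network; here just a deterministic
    pseudo-random value keyed on the architecture (assumption)."""
    return random.Random(hash(arch)).random()


def n_params(arch):
    """Toy complexity objective: sum of layer widths standing in for the
    parameter count of the model."""
    return sum(arch)


def dominates(a, b):
    """Pareto dominance for (proxy, params): maximise proxy, minimise params."""
    no_worse = a[0] >= b[0] and a[1] <= b[1]
    better = a[0] > b[0] or a[1] < b[1]
    return no_worse and better


def update_archive(archive, candidate):
    """Keep only mutually non-dominated solutions, as in a Pareto archive."""
    if any(dominates(s, candidate) for s in archive):
        return archive  # candidate is dominated; archive unchanged
    return [s for s in archive if not dominates(candidate, s)] + [candidate]


def mutate(arch, rng):
    """(1 + 1)-style mutation: nudge one layer width up or down."""
    arch = list(arch)
    i = rng.randrange(len(arch))
    arch[i] = max(8, arch[i] + rng.choice([-8, 8]))
    return tuple(arch)


if __name__ == "__main__":
    rng = random.Random(0)
    current = (32, 64, 128)  # toy encoding: widths of three layers
    cur_point = (proxy_score(current), n_params(current))
    archive = [(*cur_point, current)]
    for _ in range(200):
        child = mutate(current, rng)
        child_point = (proxy_score(child), n_params(child))
        archive = update_archive(archive, (*child_point, child))
        if not dominates(cur_point, child_point):
            current, cur_point = child, child_point  # accept non-dominated child
    for score, size, arch in sorted(archive, key=lambda s: s[1]):
        print(f"arch={arch}  params={size}  proxy={score:.3f}")
```

The final archive plays the role of the diverse collection of trade-off solutions described in the abstract: each entry is the best proxy score seen at its level of complexity.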
Original language | English |
---|---|
Title of host publication | Proceedings of the Genetic and Evolutionary Computation Conference Companion, GECCO 2025, Málaga, Spain, July 14-18, 2025 |
Editors | Gabriela Ochoa |
Publisher | Association for Computing Machinery (ACM) |
Number of pages | 4 |
ISBN (Electronic) | 979-8-4007-1464-1 |
DOIs | |
Publication status | Accepted/In press - 19 Mar 2025 |
Event | GECCO 2025: The Genetic and Evolutionary Computation Conference, Málaga, Spain. Duration: 14 Jul 2025 → 18 Jul 2025. https://gecco-2025.sigevo.org/HomePage |
Conference
Conference | GECCO 2025 |
---|---|
Abbreviated title | GECCO |
Country/Territory | Spain |
City | Málaga |
Period | 14 Jul 2025 → 18 Jul 2025 |
Internet address | https://gecco-2025.sigevo.org/HomePage |
Keywords
- neural networks
- neuroevolution
- Neural Architecture Search
- fitness approximation
- multi-objective optimisation