
EvoX Logo

arXiv Documentation PyPI-Version Python-Version Discord Server QQ Group GitHub User's Stars


🌟 Distributed GPU-accelerated Framework for Scalable Evolutionary Computation 🌟


Built upon JAX and Ray, EvoX offers a comprehensive suite of 50+ Evolutionary Algorithms (EAs) and 100+ Benchmark Problems/Environments, all benefiting from distributed GPU acceleration. It facilitates efficient exploration of complex optimization landscapes, effective handling of black-box optimization challenges, and deep dives into neuroevolution with Brax. With a foundation in functional programming and hierarchical state management, EvoX offers a user-friendly and modular experience. For more details, please refer to our Paper and Documentation (English / Chinese).

Key Features

  • 🚀 Fast Performance:

    • Experience GPU-Accelerated optimization, achieving speeds over 100x faster than traditional methods.
    • Leverage the power of Distributed Workflows for even more rapid optimization.
  • ๐ŸŒ Versatile Optimization Suite:

    • Cater to all your needs with both Single-objective and Multi-objective optimization capabilities.
    • Dive into a comprehensive library of Benchmark Problems/Environments, ensuring robust testing and evaluation.
    • Explore the frontier of AI with extensive tools for Neuroevolution/RL tasks.
  • ๐Ÿ› ๏ธ Designed for Simplicity:

    • Embrace the elegance of Functional Programming, simplifying complex algorithmic compositions.
    • Benefit from Hierarchical State Management, ensuring modular and clean programming (a minimal sketch of this pattern follows the list).
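
To give a concrete flavor of these last two points, below is a minimal, illustrative sketch of the pure-function, hierarchical-state pattern. It is deliberately not the EvoX API: the names (init_algorithm, step, and the dictionary-based state tree) are hypothetical and only show how each component owns a sub-state and how every step is a side-effect-free function from the old state to a new one, which is what lets the whole loop compose with jax.jit.

import jax

def init_algorithm(key):
    # each component initializes and owns its own sub-state
    # (here: a toy population of 100 two-dimensional candidates plus an RNG key)
    return {"population": jax.random.uniform(key, (100, 2)), "key": key}

def step(state):
    # a pure function: it reads the whole state tree and returns a new one,
    # never mutating anything in place
    key, subkey = jax.random.split(state["algorithm"]["key"])
    noise = jax.random.normal(subkey, state["algorithm"]["population"].shape)
    new_pop = state["algorithm"]["population"] + 0.1 * noise
    return {"algorithm": {"population": new_pop, "key": key}}

state = {"algorithm": init_algorithm(jax.random.PRNGKey(0))}
state = jax.jit(step)(state)  # pure, state-threading steps compose cleanly with jax.jit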

Main Contents

Evolutionary Algorithms for Single-objective Optimization

  • Differential Evolution: CoDE, JaDE, SaDE, SHADE, IMODE, ...
  • Evolution Strategy: CMA-ES, PGPE, OpenES, CR-FM-NES, xNES, ...
  • Particle Swarm Optimization: FIPS, CSO, CPSO, CLPSO, SL-PSO, ...

Evolutionary Algorithms for Multi-objective Optimization

  • Dominance-based: NSGA-II, NSGA-III, SPEA2, BiGE, KnEA, ...
  • Decomposition-based: MOEA/D, RVEA, t-DEA, MOEAD-M2M, EAG-MOEAD, ...
  • Indicator-based: IBEA, HypE, SRA, MaOEA-IGD, AR-MOEA, ...

For a comprehensive list and further details of all algorithms, please check the API Documentation.

Benchmark Problems/Environments

  • Numerical: DTLZ, LSMOP, MaF, ZDT, CEC'22, ...
  • Neuroevolution/RL: Brax, Gym, TorchVision Dataset, ...

For a comprehensive list and further details of all benchmark problems/environments, please check the API Documentation.

Setting Up EvoX

Install evox effortlessly via pip:

pip install evox
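
For GPU acceleration, JAX itself must be installed with CUDA support. The exact command depends on your CUDA version and JAX release, so treat the line below only as a typical example and follow the JAX installation guide for your setup:

pip install -U "jax[cuda12]"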

Note: To set up EvoX with GPU acceleration capabilities, you will need to set up JAX first. For details, please refer to our comprehensive Installation Guide. Additionally, you can watch our instructional videos:

🎥 EvoX Installation Guide (Linux)

🎥 EvoX Installation Guide (Windows)

🎥 EvoX Installation Guide (Linux, in Chinese)

🎥 EvoX Installation Guide (Windows, in Chinese)

Quick Start

Kickstart your journey with EvoX in just a few simple steps:

  1. Import the necessary modules:
import jax
import jax.numpy as jnp
import evox
from evox import algorithms, problems, workflows
  2. Configure an algorithm and define a problem:
pso = algorithms.PSO(
    lb=jnp.full(shape=(2,), fill_value=-32),
    ub=jnp.full(shape=(2,), fill_value=32),
    pop_size=100,
)
ackley = problems.numerical.Ackley()
  3. Compose and initialize the workflow:
workflow = workflows.StdWorkflow(pso, ackley)
key = jax.random.PRNGKey(42)
state = workflow.init(key)
  4. Run the workflow:
# Execute the workflow for 100 iterations
for i in range(100):
    state = workflow.step(state)
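
Optionally, since the entire state is threaded explicitly through step, the loop body can be compiled with jax.jit for extra speed. This is a sketch under the assumption that workflow.step is jit-compatible in your EvoX version; if StdWorkflow already compiles its step internally, the extra wrapper is redundant but harmless:

# jit-compile the step function (assumes workflow.step is jit-compatible)
jit_step = jax.jit(workflow.step)
for i in range(100):
    state = jit_step(state)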

Use-cases and Applications

Try out ready-to-play examples in your browser with Colab:

  • Basic Usage (Open in Colab)
  • Numerical Optimization (Open in Colab)
  • Neuroevolution with Gym (Open in Colab)
  • Neuroevolution with Brax (Open in Colab)
  • Custom Algorithm/Problem (Open in Colab)

For more use cases and applications, please check out the Example Directory.

Community & Support

Sister Projects

  • TensorNEAT: Tensorized NeuroEvolution of Augmenting Topologies (NEAT) for GPU Acceleration. Check out here.
  • TensorRVEA: Tensorized Reference Vector Guided Evolutionary Algorithm (RVEA) for GPU Acceleration. Check out here.
  • TensorACO: Tensorized Ant Colony Optimization (ACO) for GPU Acceleration. Check out here.
  • EvoXBench: A benchmark platform for Neural Architecture Search (NAS) that does not require GPUs, PyTorch, or TensorFlow, and supports various programming languages such as Java, MATLAB, and Python. Check out here.

Citing EvoX

If you use EvoX in your research, please cite it as follows:

@article{evox,
  title = {{EvoX}: {A} {Distributed} {GPU}-accelerated {Framework} for {Scalable} {Evolutionary} {Computation}},
  author = {Huang, Beichen and Cheng, Ran and Li, Zhuozhao and Jin, Yaochu and Tan, Kay Chen},
  journal = {IEEE Transactions on Evolutionary Computation},
  year = 2024,
  doi = {10.1109/TEVC.2024.3388550}
}

Star History

Star History Chart