Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
Distributed GPU-Accelerated Framework for Evolutionary Computation. Comprehensive Library of Evolutionary Algorithms & Benchmark Problems.
[***JMLR-2024***] PyPop7: A Pure-Python Library for POPulation-based Black-Box Optimization (BBO), especially *Large-Scale* versions/variants (e.g., evolutionary algorithms, swarm-based optimizers, pattern search, and random search). [Citation: https://jmlr.org/papers/v25/23-0386.html (***CCF-A***)]
ACL'2023: Multi-Task Pre-Training of Modular Prompt for Few-Shot Learning
Gradient-free optimization method for multivariable functions based on the low-rank tensor train (TT) format and the maximal-volume principle.
Gradient-free optimization method for multidimensional arrays and discretized multivariate functions based on the tensor train (TT) format.
Tutorials for the optimization techniques used in Gradient-Free-Optimizers and Hyperactive.
Deep neural network optimization platform with gradient-based and gradient-free algorithms.
Markov Chain Monte Carlo binary network optimization
🥭 MANGO: Maximization of neural Activation via Non-Gradient Optimization
EvoRBF: A Nature-inspired Algorithmic Framework for Evolving Radial Basis Function Networks
Gradient-free reinforcement learning for PyTorch.
A pure-MATLAB library of EVolutionary (population-based) OPTimization for Large-Scale black-box continuous Optimization (evopt-lso).
Gradient-free reinforcement learning solving OpenAI Gym LunarLander-v2 via an evolution strategy (genetic algorithm).
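The evolution-strategy idea behind repositories like this one can be sketched in a few lines: sample Gaussian perturbations of the policy parameters, weight each perturbation by the reward it earned, and step the mean parameters in that direction — no gradients of the reward function are ever computed. This is a minimal, hedged sketch (the function name `evolution_strategy` and the toy quadratic "reward" are illustrative stand-ins, not the API of any listed repository):

```python
import numpy as np

def evolution_strategy(reward_fn, dim, pop_size=50, sigma=0.1, lr=0.02,
                       iters=200, seed=0):
    """OpenAI-ES-style update: sample Gaussian parameter perturbations,
    score them with reward_fn, standardize the rewards, and move the
    mean parameters toward the better-scoring perturbations."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(dim)
    for _ in range(iters):
        noise = rng.standard_normal((pop_size, dim))
        rewards = np.array([reward_fn(theta + sigma * n) for n in noise])
        # Standardize rewards so the step size is scale-invariant
        adv = (rewards - rewards.mean()) / (rewards.std() + 1e-8)
        theta = theta + (lr / (pop_size * sigma)) * noise.T @ adv
    return theta

# Toy stand-in for an episode return: maximize -||theta - target||^2
target = np.array([1.0, -2.0, 0.5])
best = evolution_strategy(lambda th: -np.sum((th - target) ** 2), dim=3)
```

In an actual RL setting, `reward_fn` would roll out one episode of the environment (e.g., LunarLander-v2) with the perturbed policy parameters and return the episode's total reward.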
A pure-MATLAB library for POPulation-based Large-Scale Black-Box Optimization (pop-lsbbo).
Implementation code for the paper "Bayesian Optimization via Exact Penalty"
Implementation of smoothing-based optimization algorithms
Particle Swarm Optimiser
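For readers new to the topic, a standard global-best particle swarm — the technique behind optimisers like the one above — fits in a short function: each particle keeps a velocity, its own best-seen position, and is pulled toward the swarm's best-so-far. This is a generic sketch with conventional default coefficients, not the interface of the listed repository:

```python
import numpy as np

def particle_swarm(f, bounds, n_particles=30, iters=100,
                   w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over a box using a global-best particle swarm:
    velocity = inertia + pull toward personal best + pull toward swarm best."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, (n_particles, lo.size))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1 = rng.random(x.shape)
        r2 = rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)  # keep particles inside the box
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Minimize the 2-D sphere function; the optimum is at the origin
best_x, best_val = particle_swarm(lambda p: np.sum(p ** 2), ([-5, -5], [5, 5]))
```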
Zeroth-order Frank-Wolfe algorithm. Project for the Optimization for Data Science exam.
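The zeroth-order Frank-Wolfe idea combines two ingredients: a two-point finite-difference gradient estimate (so only function values are queried) and Frank-Wolfe's linear minimization oracle, which over an l1-ball simply returns a signed vertex. The following is a hedged sketch of this combination under those assumptions — the function `zo_frank_wolfe`, the number of sampling directions `q`, and the smoothing radius `mu` are illustrative choices, not the listed project's code:

```python
import numpy as np

def zo_frank_wolfe(f, dim, radius=1.0, iters=200, q=10, mu=1e-4, seed=0):
    """Frank-Wolfe over the l1-ball {x : ||x||_1 <= radius}, with the
    gradient replaced by an averaged two-point finite-difference estimate."""
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)  # feasible starting point (inside the ball)
    for t in range(iters):
        # Average q two-point estimates along random Gaussian directions
        g = np.zeros(dim)
        for _ in range(q):
            u = rng.standard_normal(dim)
            g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
        g /= q
        # Linear minimization oracle for the l1-ball: a signed vertex
        s = np.zeros(dim)
        i = int(np.argmax(np.abs(g)))
        s[i] = -radius * np.sign(g[i])
        gamma = 2.0 / (t + 2)  # standard Frank-Wolfe step-size schedule
        x = (1 - gamma) * x + gamma * s
    return x

# Minimize ||x - c||^2 with c inside the unit l1-ball
c = np.array([0.3, -0.2])
x_star = zo_frank_wolfe(lambda x: np.sum((x - c) ** 2), dim=2)
```

Because each iterate is a convex combination of the previous iterate and a vertex of the ball, the method is projection-free: feasibility is maintained automatically.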
Black-box adversarial attacks on deep neural networks with tensor train (TT) decomposition and PROTES optimizer.