MKLTwoStageRO.jl
is a Julia package for multiple kernel learning (MKL) aided two-stage robust optimization (TwoStageRO).
Generally, it solves the following TwoStageRO problem:
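A sketch of this general form, inferred from the `TSROModel` components (`a`, `b`, `E`, `e`, `B`, `c`, `A`, `C`, `Sx`, `Sy`, `U`) used later in this README (the notation here is illustrative rather than the package's authoritative statement):

$$
\min_{x \in S_x} \; a^\top x \;+\; \max_{u \in U} \; \min_{y \in S_y(x,u)} \; b^\top y
\qquad \text{s.t. } Ex \ge e,
$$

where $S_y(x,u) = \{\, y \in S_y : By \ge c - Ax - Cu \,\}$, $S_x$ and $S_y$ are the feasible spaces of the first- and second-stage decisions, and $U$ is the (possibly data-driven) uncertainty set.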
It includes data-driven MKL uncertainty set construction [1, 2], as well as solution algorithms: the MKL uncertainty set-induced column-and-constraint generation method (MKLCCG) [2], extended column-and-constraint generation (ECCG) [3], the original (nested) column-and-constraint generation (CCG) [4, 5], and the Benders-dual cutting plane method (BDCP) [4].
using Pkg
Pkg.add("MKLTwoStageRO")
Pkg.add("GLMakie") # if the user wants visualization
Then, load the necessary packages:
using MKLTwoStageRO
using GLMakie
Consider the following location-transportation problem from [4]. To supply a commodity to customers, it is first stored at potential facility sites and then transported to the customers; building a facility incurs a fixed cost, installing capacity incurs a unit cost (up to a maximum allowable capacity), and shipping incurs a unit transportation cost.
In practice, the demand is uncertain before any facility is built and capacity is installed. However, in many cases, we can obtain some demand data (historical or experimental) and use it to construct a data-driven uncertainty set.
This problem can be modeled as the following two-stage robust optimization:
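A sketch of this formulation (following [4] and the parameters instantiated below; here $z_i \in \{0,1\}$ is the decision to build facility $i$, $x_i \ge 0$ its installed capacity, $y_{ij}$ the amount transported from facility $i$ to customer $j$, and $d_j$ the uncertain demand of customer $j$):

$$
\begin{aligned}
\min_{z,\,x}\;\; & \sum_{i} \big( f_i z_i + a_i x_i \big) \;+\; \max_{d \in U}\; \min_{y \in F(x,d)} \; \sum_{i}\sum_{j} c_{ij} y_{ij} \\
\text{s.t.}\;\; & x_i \le K_i z_i, \quad z_i \in \{0,1\}, \quad x_i \ge 0, \\
& F(x,d) = \Big\{ y \ge 0 \,:\, \sum_{j} y_{ij} \le x_i \;\; \forall i, \;\; \sum_{i} y_{ij} \ge d_j \;\; \forall j \Big\}.
\end{aligned}
$$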
Here we instantiate the above problem with the following parameters:
f = [400, 414, 326]        # fixed facility building costs
a = [18, 25, 20]           # unit capacity costs
c = [22 33; 33 23; 20 25]  # unit transportation costs (facility i → customer j)
K = [5, 5, 5]              # maximum facility capacities
Assume the following is the demand data we have collected. Note that each column of the training data corresponds to an observation of the input features.
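Since the demand samples below are generated randomly, one may optionally fix Julia's global random seed first so that the example is reproducible (this uses the standard library Random and is not part of MKLTwoStageRO):

using Random
Random.seed!(1234)  # illustrative seed; any fixed value makes the synthetic data reproducible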
d1 = 0.5 .+ 4 * rand(300)             # 300 samples of customer-1 demand
d2 = 2 ./ d1 + 0.3 * randn(300) .+ 1  # 300 samples of customer-2 demand (correlated with d1)
D = [d1'; d2']                        # 2 × 300 training matrix: one observation per column
mklocsvmplot(D; backend=GLMakie) # visualize the demand data
algor = HessianMKL(verbose=false)
U = mklocsvmtrain(D, 21; kernel="DNPNK", algorithm=algor, q_mode="randomly", ν=0.01, μ=0.05)
mklocsvmplot(U; backend=GLMakie) # visualize the MKL uncertainty set
Note:
- `kernel`: Generally, the user has two choices, `"DPDK"` or `"DNPNK"`.
- `algorithm`: an instance of `HessianMKL` or `QCQP`.
- `q_mode`: `"evenly"` or `"randomly"`.
- `ν`: an upper bound on the proportion of outliers, which hence controls the conservatism of the robust optimization.

Please see [1, 2] and the package MKLOneClassSVM.jl for more information.
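For instance, a sketch of an alternative configuration (keeping the same keyword set as the call above; the specific values are only illustrative, and whether they suit a given data set is up to the user):

algor_alt = HessianMKL(verbose=false)
# evenly generated basis kernels and a slightly larger bound on the outlier proportion
U_alt = mklocsvmtrain(D, 21; kernel="DNPNK", algorithm=algor_alt, q_mode="evenly", ν=0.05, μ=0.05)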
The user needs to organize the model into the general form and specify each component in it, thus establishing a `TSROModel`.
a = [400, 414, 326, 18, 25, 20]  # first-stage costs: fixed building costs followed by unit capacity costs
b = [22, 33, 20, 33, 23, 25]     # second-stage costs: vec(c), the unit transportation costs
E = [5 0 0 -1 0 0; 0 5 0 0 -1 0; 0 0 5 0 0 -1]  # first-stage constraint matrix (links capacity to the build decision, K = 5)
e = [0.0, 0.0, 0.0]              # first-stage right-hand side
B = [-1.0 0.0 0.0 -1.0 0.0 0.0; 0.0 -1.0 0.0 0.0 -1.0 0.0; 0.0 0.0 -1.0 0.0 0.0 -1.0; 1.0 1.0 1.0 0.0 0.0 0.0; 0.0 0.0 0.0 1.0 1.0 1.0]  # coefficients of y in the second-stage constraints
c = [0.0, 0.0, 0.0, 0.0, 0.0]    # second-stage right-hand side
A = [0.0 0.0 0.0 1.0 0.0 0.0; 0.0 0.0 0.0 0.0 1.0 0.0; 0.0 0.0 0.0 0.0 0.0 1.0; 0.0 0.0 0.0 0.0 0.0 0.0; 0.0 0.0 0.0 0.0 0.0 0.0]  # coefficients of x (installed capacities) in the second-stage constraints
C = [0.0 0.0; 0.0 0.0; 0.0 0.0; -1.0 0.0; 0.0 -1.0]  # coefficients of the uncertain demands in the second-stage constraints
Sx = [ZeroOne(3), Rplus(3)]      # first stage: 3 binary build decisions and 3 nonnegative capacities
Sy = [Rplus(6)]                  # second stage: 6 nonnegative transportation amounts
model = TSROModel(a, b, E, e, B, c, A, C, Sx, Sy, U)
mklccg = MKLCCG()
x_star, objv, recourse = solve(model; algorithm=mklccg)
û = [2.0, 3.0] # assume this is the uncertainty observation
y_star, RPobjv = recourse(û)
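The returned objects can then be inspected directly; for example (a minimal sketch, assuming `x_star`, `objv`, `y_star`, and `RPobjv` are plain Julia values as returned above):

println("first-stage decision x* = ", x_star)   # facility build and capacity decisions
println("worst-case total cost   = ", objv)
println("recourse decision y*    = ", y_star)   # shipments under the observed demand û
println("second-stage cost       = ", RPobjv)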
# Optionally, the MKL uncertainty set can be trained in parallel by splitting the data into batches:
using Distributed
addprocs(3)                       # launch 3 worker processes
@everywhere using MKLTwoStageRO   # load the package on every worker
Û = mklocsvmtrain(D, 21; kernel="DNPNK", algorithm=algor, q_mode="randomly", ν=0.01, μ=0.05, num_batch=3)  # num_batch data batches trained in parallel
mklocsvmplot(Û; backend=GLMakie)  # visualize the MKL uncertainty set
Then we can also build the `TSROModel` and solve it:
model = TSROModel(a, b, E, e, B, c, A, C, Sx, Sy, Û)
x_star, objv, recourse = solve(model; algorithm=mklccg)
If there is an MKL-based uncertainty set `U`, the user does not have to use the MKLCCG algorithm to solve the model (although it is recommended). Other two-stage robust optimization algorithms are also available in this package, such as the previously mentioned extended column-and-constraint generation (ECCG) [3], the original (nested) column-and-constraint generation (CCG) [4, 5], and the Benders-dual cutting plane method (BDCP) [4]. However, before calling these algorithms, it is necessary to first convert the MKL uncertainty set into a `JuMP` model expression and then build the `model` again, e.g.,
Ū = convert_to_jumpmodel(U; form="linear", varname=:u)  # convert the MKL uncertainty set to a JuMP model
model = TSROModel(a, b, E, e, B, c, A, C, Sx, Sy, Ū)
eccg = ECCG()
x_star, objv, recourse = solve(model; algorithm=eccg)
Note that different algorithms currently support slightly different types of problems, roughly as follows:
- `MKLCCG`: `model.U` should be an MKL-based uncertainty set. `model.Sx` is allowed to be a nonnegative real-valued space or a nonnegative mixed integer space, and `model.Sy` should be a nonnegative real-valued space. The Feasibility assumption is necessary.
- `ECCG`: `model.U` can be modeled as a `model` from JuMP.jl or a `Polyhedron` from Polyhedra.jl. `model.Sx` is allowed to be a nonnegative real-valued space or a nonnegative mixed integer space, and `model.Sy` should be a nonnegative real-valued space. The Feasibility assumption is necessary.
- `CCG`: `model.U` can be modeled as a `model` from JuMP.jl or a `Polyhedron` from Polyhedra.jl. `model.Sx` and `model.Sy` are allowed to be nonnegative real-valued spaces or nonnegative mixed integer spaces. The Relatively Complete Recourse assumption is necessary when `model.Sy` is a nonnegative real-valued space; the Extended Relatively Complete Recourse assumption is necessary when `model.Sy` is a nonnegative mixed integer space, in which case `model.Sy` must have at least one real-valued dimension.
- `BDCP`: `model.U` can be modeled as a `model` from JuMP.jl or a `Polyhedron` from Polyhedra.jl. `model.Sx` is allowed to be a nonnegative real-valued space or a nonnegative mixed integer space, and `model.Sy` should be a nonnegative real-valued space. The Relatively Complete Recourse assumption should hold.
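As a quick reference for the spaces mentioned above, the constructors used throughout this README can be combined as follows (the dimensions are illustrative):

Sy_real  = [Rplus(6)]              # a nonnegative real-valued space with 6 dimensions
Sx_mixed = [ZeroOne(2), Rplus(4)]  # a nonnegative mixed integer space: 2 binary + 4 continuous dimensions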
The user can also construct uncertainty sets using the syntax of JuMP.jl or Polyhedra.jl, for example:
- Case-1:
c = [1, 2]; b = [3];
A = [4 5]; d = [6];
G = [1;;]; h = [2]; E = [-3 -4]; M = [-5 -6];
Sx = [Rplus()]; Sy = [Rplus(), ZeroOne()];
UncMod = Model()                      # define the uncertainty set as a JuMP model
u0 = [0, 0]                           # nominal value of the uncertainty
@variable(UncMod, u[1:2])
@variable(UncMod, -1 <= δ[1:2] <= 1)
@constraint(UncMod, u == u0 + δ)      # u lies in a unit box around u0
model = TSROModel(c, b, A, d, G, h, E, M, Sy, Sx, UncMod)  # here y is the first-stage variable and x the recourse
ccg = CCG()
y_star, objv, recourse = solve(model; algorithm=ccg)
û = [0.5, -0.5]
x_star, RPobjv = recourse(û)
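Under the general form sketched at the beginning of this README (an assumption, since the argument order is inferred from the examples), Case-1 can be read as the following small problem, where $y$ is the first-stage decision and $x$ the recourse:

$$
\begin{aligned}
\min_{y_1 \ge 0,\; y_2 \in \{0,1\}}\;\; & y_1 + 2 y_2 \;+\; \max_{u \in U}\; \min_{x \ge 0} \; 3x \\
\text{s.t.}\;\; & 4 y_1 + 5 y_2 \ge 6, \\
& x \ge 2 + 3 y_1 + 4 y_2 + 5 u_1 + 6 u_2, \\
& U = \{\, u_0 + \delta \,:\, -1 \le \delta_j \le 1,\; j = 1, 2 \,\}, \quad u_0 = (0, 0).
\end{aligned}
$$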
- Case-2:
c = [2]; b = [1];
A = [0;;]; d = [0];
G = [1;;]; h = [0]; E = [-1;;]; M = [-1;;];
Sy = [ZeroOne()]; Sx = [Rplus()];
UncSet = polyhedron(HalfSpace([1], 1) ∩ HalfSpace([-1], 0)) # 0 <= u <= 1
model = TSROModel(c, b, A, d, G, h, E, M, Sy, Sx, UncSet)
bdcp = BDCP()
y_star, objv, recourse = solve(model; algorithm=bdcp)
û = [0.5]
x_star, RPobjv = recourse(û)
If you find MKLTwoStageRO.jl useful, we kindly request that you cite this repository and the following paper:
@article{Han2024Multiple,
title={Multiple kernel learning-aided column-and-constraint generation method},
author={Han, Biao},
year={2024},
publisher={optimization-online.org},
url={https://optimization-online.org/?p=28865},
}
By default, this package implicitly uses KernelFunctions.jl, the open-source solvers HiGHS.jl and Ipopt.jl, and the single-kernel SVM solver LIBSVM.jl. Thanks to these useful packages; the user is also free to replace them with other alternatives. In addition, many members of the Julia community offered inspiring guidance, and the author would like to thank them.
- Han, B., Shang, C., & Huang, D. (2021). Multiple kernel learning-aided robust optimization: Learning algorithm, computational tractability, and usage in multi-stage decision-making. European Journal of Operational Research, 292(3), 1004-1018.
- Han, B. (2024). Multiple kernel learning-aided column-and-constraint generation method. Available on optimization-online.org: https://optimization-online.org/?p=28865
- Bertsimas, D., & Shtern, S. (2018). A scalable algorithm for two-stage adaptive linear optimization. arXiv preprint arXiv:1807.02812.
- Zeng, B., & Zhao, L. (2013). Solving two-stage robust optimization problems using a column-and-constraint generation method. Operations Research Letters, 41(5), 457-461.
- Zhao, L., & Zeng, B. (2012). An exact algorithm for two-stage robust optimization with mixed integer recourse problems. Submitted, available on optimization-online.org.