MKLTwoStageRO.jl

MKLTwoStageRO.jl is a Julia package for multiple kernel learning (MKL) aided two-stage robust optimization (TwoStageRO).

Generally, it solves the following TwoStageRO problem:
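In terms of the `TSROModel` components used below ($a$, $b$, $E$, $e$, $B$, $c$, $A$, $C$, $S_x$, $S_y$, $\mathcal{U}$), the general form is presumably (a sketch inferred from those components, not the original display):

$$\min_{x\in S_x}\ a^\top x \;+\; \max_{u\in\mathcal{U}}\ \min_{y\in S_y(x,u)}\ b^\top y \qquad \text{s.t.}\quad Ex \geq e,$$

where $S_y(x,u) = \{\, y\in S_y : Ax + By + Cu \geq c \,\}$, $x$ is the first-stage (here-and-now) decision, $y$ is the second-stage (wait-and-see) decision, and $u$ is the uncertain parameter taking values in the uncertainty set $\mathcal{U}$.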

It includes data-driven MKL uncertainty set construction [1, 2] and algorithms such as MKL uncertainty set-induced column-and-constraint generation (MKLCCG) [2], as well as extended column-and-constraint generation (ECCG) [3], original (nested) column-and-constraint generation (CCG) [4, 5], and the Benders-dual cutting plane method (BDCP) [4].

Package Installation

using Pkg
Pkg.add("MKLTwoStageRO")
Pkg.add("GLMakie") # if the user wants visualization

Then load the necessary packages:

using MKLTwoStageRO
using GLMakie

A Toy Example: Two-Stage Robust Location-Transportation Problem

Consider the following location-transportation problem from [4]. To supply a commodity to customers, it is first stored at $m$ potential facilities and then transported to $n$ customers. The fixed cost of building a facility at site $i$ is $f_i$, and the unit capacity cost is $a_i$, for $i=1,\cdots,m$. The demand is $d_j$ for $j=1,\cdots,n$, and the unit transportation cost between site $i$ and customer $j$ is $c_{ij}$ for each $i$-$j$ pair. The maximal allowable capacity of the facility at site $i$ is $K_i$, and $\sum_iK_i\geq\sum_jd_j$ ensures feasibility. Let $y_i\in\{0,1\}$ be the facility location variable, $z_i\in\mathbb{R}_+$ the capacity variable, and $x_{ij}\in\mathbb{R}_+$ the transportation variable.

In practice, the demand is uncertain before any facility is built and capacity is installed. However, in many cases, we can obtain some demand data (historical or experimental) $\mathcal{D}$ through market research and other methods. The question is how to make robust decisions based on these data.

This problem can be modeled as the following two-stage robust optimization:
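A formulation consistent with [4] and with the parameter matrices constructed below (reconstructed here rather than taken verbatim from the original display) is:

$$\min_{y,z}\ \sum_{i=1}^m \left(f_i y_i + a_i z_i\right) \;+\; \max_{d\in\mathcal{U}}\ \min_{x}\ \sum_{i=1}^m\sum_{j=1}^n c_{ij} x_{ij}$$

$$\text{s.t.}\quad z_i \leq K_i y_i,\ \forall i; \qquad \sum_{j} x_{ij} \leq z_i,\ \forall i; \qquad \sum_{i} x_{ij} \geq d_j,\ \forall j; \qquad y_i\in\{0,1\},\ z_i\geq 0,\ x_{ij}\geq 0.$$

Here the capacity constraints $z_i \leq K_i y_i$ are first-stage constraints, while the supply and demand constraints couple the second-stage transportation decisions $x_{ij}$ with the uncertain demand $d$.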

Here we instantiate the above problem with the following parameters:

f = [400, 414, 326]
a = [18, 25, 20]
c = [22 33; 33 23; 20 25]
K = [5, 5, 5]

Collecting uncertain data

Assume the following is the demand data we have collected. Note that each column of the training data corresponds to an observation of the input features.

d1 = 0.5 .+ 4 * rand(300)
d2 = 2 ./ d1 + 0.3 * randn(300) .+ 1
D = [d1'; d2']
mklocsvmplot(D; backend=GLMakie) # visualize the demand data

Constructing MKL-based uncertainty set

algor = HessianMKL(verbose=false)
U = mklocsvmtrain(D, 21; kernel="DNPNK", algorithm=algor, q_mode="randomly", ν=0.01, μ=0.05)
mklocsvmplot(U; backend=GLMakie) # visualize the MKL uncertainty set

Note:

  1. kernel: Generally, the user has two choices, "DPDK" or "DNPNK".
  2. algorithm: An instance of HessianMKL or QCQP.
  3. q_mode: "evenly" or "randomly".
  4. ν: An upper bound on the fraction of outliers; it therefore controls the conservatism of the robust optimization.

Please see [1,2] and package MKLOneClassSVM.jl for more information.

Modeling the problem

The user needs to organize the model into the general form and specify each component in it, thus establishing a TSROModel.

a = [400, 414, 326, 18, 25, 20]
b = [22, 33, 20, 33, 23, 25]
E = [5 0 0 -1 0 0; 0 5 0 0 -1 0; 0 0 5 0 0 -1]
e = [0.0, 0.0, 0.0]
B = [-1.0 0.0 0.0 -1.0 0.0 0.0; 0.0 -1.0 0.0 0.0 -1.0 0.0; 0.0 0.0 -1.0 0.0 0.0 -1.0; 1.0 1.0 1.0 0.0 0.0 0.0; 0.0 0.0 0.0 1.0 1.0 1.0]
c = [0.0, 0.0, 0.0, 0.0, 0.0]
A = [0.0 0.0 0.0 1.0 0.0 0.0; 0.0 0.0 0.0 0.0 1.0 0.0; 0.0 0.0 0.0 0.0 0.0 1.0; 0.0 0.0 0.0 0.0 0.0 0.0; 0.0 0.0 0.0 0.0 0.0 0.0]
C = [0.0 0.0; 0.0 0.0; 0.0 0.0; -1.0 0.0; 0.0 -1.0]
Sx = [ZeroOne(3), Rplus(3)]
Sy = [Rplus(6)]
model = TSROModel(a, b, E, e, B, c, A, C, Sx, Sy, U)

Solving the model using MKLCCG algorithm

mklccg = MKLCCG()
x_star, objv, recourse = solve(model; algorithm=mklccg)

Retrieving the recourse decision

û = [2.0, 3.0] # assume this is the uncertainty observation
y_star, RPobjv = recourse(û)

Other Features

The MKL uncertainty set can be trained in a distributed manner

using Distributed
addprocs(3)
@everywhere using MKLTwoStageRO

Û = mklocsvmtrain(D, 21; kernel="DNPNK", algorithm=algor, q_mode="randomly", ν=0.01, μ=0.05, num_batch=3)
mklocsvmplot(Û; backend=GLMakie) # visualize the MKL uncertainty set

Then we can also build the TSROModel and solve it:

model = TSROModel(a, b, E, e, B, c, A, C, Sx, Sy, Û)
x_star, objv, recourse = solve(model; algorithm=mklccg)

Other algorithms can also be used

If an MKL-based uncertainty set U is available, the user does not have to use the MKLCCG algorithm to solve the model (although it is recommended). Other two-stage robust optimization algorithms are available in this package, such as the previously mentioned extended column-and-constraint generation (ECCG) [3], original (nested) column-and-constraint generation (CCG) [4, 5], and the Benders-dual cutting plane method (BDCP) [4]. However, before calling these algorithms, it is necessary to first convert the MKL uncertainty set into a JuMP model expression and then build the model again, e.g.,

Ū = convert_to_jumpmodel(U; form="linear", varname=:u)
model = TSROModel(a, b, E, e, B, c, A, C, Sx, Sy, Ū)
eccg = ECCG()
x_star, objv, recourse = solve(model; algorithm=eccg)

Note that different algorithms currently support slightly different types of problems, roughly as follows:

  • MKLCCG: model.U should be an MKL-based uncertainty set. model.Sx is allowed to be a nonnegative real-valued space or a nonnegative mixed integer space, and model.Sy should be a nonnegative real-valued space. The Feasibility assumption is necessary.
  • ECCG: model.U can be modeled as a model from JuMP.jl or a Polyhedron from Polyhedra.jl. model.Sx is allowed to be a nonnegative real-valued space or a nonnegative mixed integer space, and model.Sy should be a nonnegative real-valued space. The Feasibility assumption is necessary.
  • CCG: model.U can be modeled as a model from JuMP.jl or a Polyhedron from Polyhedra.jl. model.Sx and model.Sy are allowed to be nonnegative real-valued spaces or nonnegative mixed integer spaces. The Relatively Complete Recourse assumption is necessary when model.Sy is a nonnegative real-valued space; the Extended Relatively Complete Recourse assumption is necessary when model.Sy is a nonnegative mixed integer space, in which case model.Sy must have at least one real-valued dimension.
  • BDCP: model.U can be modeled as a model from JuMP.jl or a Polyhedron from Polyhedra.jl. model.Sx is allowed to be a nonnegative real-valued space or a nonnegative mixed integer space, and model.Sy should be a nonnegative real-valued space. The Relatively Complete Recourse assumption should hold.

The uncertainty set doesn't have to be an MKL-based one

The user can also construct uncertainty sets using the syntax of JuMP.jl or Polyhedra.jl, for example:

  • Case-1:
c = [1, 2]; b = [3];
A = [4 5]; d = [6]; 
G = [1;;]; h = [2]; E = [-3 -4]; M = [-5 -6];
Sx = [Rplus()]; Sy = [Rplus(), ZeroOne()];
UncMod = Model()
u0 = [0, 0]
@variable(UncMod, u[1:2])
@variable(UncMod, -1 <= δ[1:2] <= 1)
@constraint(UncMod, u == u0 + δ)
model = TSROModel(c, b, A, d, G, h, E, M, Sy, Sx, UncMod)
ccg = CCG()
y_star, objv, recourse = solve(model; algorithm=ccg)
û = [0.5, -0.5]
x_star, RPobjv = recourse(û)
  • Case-2:
c = [2]; b = [1];
A = [0;;]; d = [0]; 
G = [1;;]; h = [0]; E = [-1;;]; M = [-1;;];
Sy = [ZeroOne()]; Sx = [Rplus()]; 
UncSet = polyhedron(HalfSpace([1], 1) ∩ HalfSpace([-1], 0)) # 0 <= u <= 1
model = TSROModel(c, b, A, d, G, h, E, M, Sy, Sx, UncSet)
bdcp = BDCP()
y_star, objv, recourse = solve(model; algorithm=bdcp)
û = [0.5]
x_star, RPobjv = recourse(û)

Citing

If you find MKLTwoStageRO.jl useful, we kindly request that you cite this repository and the following paper:

@article{Han2024Multiple,
  title={Multiple kernel learning-aided column-and-constraint generation method},
  author={Han, Biao},
  year={2024},
  publisher={optimization-online.org},
  url={https://optimization-online.org/?p=28865},
}

Acknowledgments

By default, this package implicitly uses KernelFunctions.jl, the open-source solvers HiGHS.jl and Ipopt.jl, and the single-kernel SVM solver LIBSVM.jl. Thanks to the authors of these useful packages; the user is also free to replace them with other alternatives. In addition, the author would like to thank the many members of the Julia community who provided inspiring guidance.

References

  1. Han, B., Shang, C., & Huang, D. (2021). Multiple kernel learning-aided robust optimization: Learning algorithm, computational tractability, and usage in multi-stage decision-making. European Journal of Operational Research, 292(3), 1004-1018.
  2. Han, B. (2024). Multiple kernel learning-aided column-and-constraint generation method. Available on Optimization Online. https://optimization-online.org/?p=28865
  3. Bertsimas, D., & Shtern, S. (2018). A scalable algorithm for two-stage adaptive linear optimization. arXiv preprint arXiv:1807.02812.
  4. Zeng, B., & Zhao, L. (2013). Solving two-stage robust optimization problems using a column-and-constraint generation method. Operations Research Letters, 41(5), 457-461.
  5. Zhao, L., & Zeng, B. (2012). An exact algorithm for two-stage robust optimization with mixed integer recourse problems. Submitted; available on Optimization Online.
