[docs] restructure conic optimization tutorials (#3822)
odow committed Sep 10, 2024
1 parent 9f23546 commit ceb8e54
Showing 17 changed files with 99 additions and 106 deletions.
4 changes: 2 additions & 2 deletions docs/make.jl
@@ -362,11 +362,11 @@ const _PAGES = [
],
"Conic programs" => [
"tutorials/conic/introduction.md",
"tutorials/conic/start_values.md",
"tutorials/conic/tips_and_tricks.md",
"tutorials/conic/simple_examples.md",
"tutorials/conic/dualization.md",
"tutorials/conic/arbitrary_precision.md",
"tutorials/conic/start_values.md",
"tutorials/conic/simple_examples.md",
"tutorials/conic/logistic_regression.md",
"tutorials/conic/experiment_design.md",
"tutorials/conic/min_ellipse.md",
6 changes: 3 additions & 3 deletions docs/src/changelog.md
@@ -596,7 +596,7 @@ changes which might be breaking for a very small number of users.
- Clarified the documentation to say that matrices in [`HermitianPSDCone`](@ref)
must be `LinearAlgebra.Hermitian` (#3241)
- Minor style fixes to internal macro code (#3247)
- - Add [Quantum state discrimination](@ref) tutorial (#3250)
+ - Add [Example: quantum state discrimination](@ref) tutorial (#3250)
- Improve error message when `begin...end` not passed to plural macros (#3255)
- Document how to register function with varying number of input arguments
(#3258)
@@ -653,12 +653,12 @@ changes which might be breaking for a very small number of users.

- Minor fixes to the documentation (#3200) (#3201) (#3203) (#3210)
- Added tutorial [Constraint programming](@ref) (#3202)
- - Added more examples to [Tips and Tricks](@ref conic_tips_and_tricks)
+ - Added more examples to [Modeling with cones](@ref)
- Remove `_distance_to_set` in favor of `MOI.Utilities.distance_to_set` (#3209)
- Improve [The diet problem](@ref) tutorial by adding the variable as a column
in the dataframe (#3213)
- Improve [The knapsack problem example](@ref) tutorial (#3216) (#3217)
- - Added the [Ellipsoid approximation](@ref) tutorial (#3218)
+ - Added the [Example: ellipsoid approximation](@ref) tutorial (#3218)

## Version 1.7.0 (January 25, 2023)

2 changes: 1 addition & 1 deletion docs/src/manual/complex.md
@@ -11,7 +11,7 @@ DocTestFilters = [r"≤|<=", r"≥|>=", r" == | = ", r" ∈ | in ", r"MathOptInt

This page explains the complex-valued variables and constraints that JuMP
supports. For a worked-example using these features, read the
- [Quantum state discrimination](@ref) tutorial.
+ [Example: quantum state discrimination](@ref) tutorial.

## Complex-valued variables

2 changes: 1 addition & 1 deletion docs/src/manual/models.md
@@ -655,7 +655,7 @@ MOIU.CachingOptimizer
Bridges can be added and removed from a [`MOI.Bridges.LazyBridgeOptimizer`](@ref)
using [`add_bridge`](@ref) and [`remove_bridge`](@ref). Use
[`print_active_bridges`](@ref) to see which bridges are used to reformulate the
- model. Read the [Ellipsoid approximation](@ref) tutorial for more details.
+ model. Read the [Example: ellipsoid approximation](@ref) tutorial for more details.

### Unsafe backend

2 changes: 1 addition & 1 deletion docs/src/tutorials/applications/optimal_power_flow.jl
@@ -20,7 +20,7 @@
# matrix cones such as the [`HermitianPSDCone`](@ref) object.

# For another example of modeling with complex decision variables, see the
- # [Quantum state discrimination](@ref) tutorial, and see the
+ # [Example: quantum state discrimination](@ref) tutorial, and see the
# [Complex number support](@ref) section of the manual for more details.

# This tutorial takes a matrix-oriented approach focused on network nodes
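For readers unfamiliar with the complex-variable features referenced in the hunk above, the following is an editorial sketch (not part of this commit) of declaring complex and Hermitian PSD decision variables in JuMP; SCS is an assumed example solver.

```julia
# Editorial sketch: complex-valued and Hermitian PSD variables in JuMP.
# Not part of this commit; SCS is an assumed example solver.
using JuMP
import SCS

model = Model(SCS.Optimizer)
@variable(model, x in ComplexPlane())                 # a single complex variable
@variable(model, H[1:2, 1:2] in HermitianPSDCone())   # a 2×2 Hermitian PSD matrix variable
```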
2 changes: 2 additions & 0 deletions docs/src/tutorials/conic/arbitrary_precision.jl
@@ -8,6 +8,8 @@
# The purpose of this tutorial is to explain how to use a solver which supports
# arithmetic using a number type other than `Float64`.

+ # ## Required packages
+
# This tutorial uses the following packages:

using JuMP
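As context for the hunk above, the pattern the tutorial documents looks roughly like the sketch below. It is editorial, not code from this commit, and it assumes Clarabel.jl as an example of a solver that supports `BigFloat` arithmetic.

```julia
# Editorial sketch: solving in extended precision with a BigFloat-capable solver.
# Assumes Clarabel.jl; not part of this commit.
using JuMP
import Clarabel

model = GenericModel{BigFloat}(Clarabel.Optimizer{BigFloat})
@variable(model, x >= big"4" / big"3")
@objective(model, Min, x)
optimize!(model)
value(x)  # a BigFloat, accurate beyond Float64 precision
```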
9 changes: 6 additions & 3 deletions docs/src/tutorials/conic/dualization.jl
@@ -6,8 +6,9 @@
# # Dualization

# The purpose of this tutorial is to explain how to use [Dualization.jl](@ref) to
- # improve the performance of some conic optimization models. There are two
- # important takeaways:
+ # improve the performance of some conic optimization models.
+
+ # There are two important takeaways:
#
# 1. JuMP reformulates problems to meet the input requirements of the
# solver, potentially increasing the problem size by adding slack variables
@@ -18,7 +19,9 @@
# [Dualization.jl](@ref) is a package which fixes these problems, allowing you
# to solve the dual instead of the primal with a one-line change to your code.

- # This tutorial uses the following packages
+ # ## Required packages
+
+ # This tutorial uses the following packages:

using JuMP
import Dualization
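The "one-line change" mentioned in the hunk above is wrapping the optimizer with `Dualization.dual_optimizer`; a sketch follows (editorial, assuming SCS as the underlying solver).

```julia
# Editorial sketch: solve the dual instead of the primal with Dualization.jl.
# SCS is an assumed example solver; not part of this commit.
using JuMP
import Dualization
import SCS

primal_model = Model(SCS.Optimizer)                            # solves the primal form
dual_model = Model(Dualization.dual_optimizer(SCS.Optimizer))  # solves the dual form instead
```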
26 changes: 13 additions & 13 deletions docs/src/tutorials/conic/ellipse_approx.jl
@@ -3,7 +3,7 @@
# v.2.0. If a copy of the MPL was not distributed with this file, You can #src
# obtain one at https://mozilla.org/MPL/2.0/. #src

- # # Ellipsoid approximation
+ # # Example: ellipsoid approximation

# This tutorial considers the problem of computing _extremal ellipsoids_:
# finding ellipsoids that best approximate a given set. As an extension, we show
@@ -12,7 +12,18 @@

# The model comes from Section 4.9 of [BenTal2001](@cite).

- # For a related example, see also the [Minimal ellipses](@ref) tutorial.
+ # For a related example, see also the [Example: minimal ellipses](@ref) tutorial.
+
+ # ## Required packages
+
+ # This tutorial uses the following packages:
+
+ using JuMP
+ import LinearAlgebra
+ import Plots
+ import Random
+ import SCS
+ import Test

# ## Problem formulation

@@ -40,17 +51,6 @@
# ```
# where ``D = Z_*`` and ``c = Z_*^{-1} z_*``.

- # ## Required packages
-
- # This tutorial uses the following packages:
-
- using JuMP
- import LinearAlgebra
- import Plots
- import Random
- import SCS
- import Test
-
# ## Data

# We first need to generate some points to work with.
17 changes: 6 additions & 11 deletions docs/src/tutorials/conic/experiment_design.jl
@@ -18,28 +18,23 @@
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE #src
# SOFTWARE. #src

- # # Experiment design
+ # # Example: experiment design

# **This tutorial was originally contributed by Arpit Bhatia and Chris Coey.**

# This tutorial covers experiment design examples (D-optimal, A-optimal, and
# E-optimal) from section 7.5 of [Boyd2004](@cite).

- # The tutorial uses the following packages
+ # ## Required packages
+
+ # This tutorial uses the following packages:
+
using JuMP
import SCS
import LinearAlgebra
+ import MathOptInterface as MOI
import Random

- # !!! info
- #     This tutorial uses sets from [MathOptInterface](@ref moi_documentation).
- #     By default, JuMP exports the `MOI` symbol as an alias for the
- #     MathOptInterface.jl package. We recommend making this more explicit in
- #     your code by adding the following lines:
- #     ```julia
- #     import MathOptInterface as MOI
- #     ```
-
# We set a seed so the random numbers are repeatable:
Random.seed!(1234)

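As a rough illustration of the experiment-design problems mentioned above (from [Boyd2004] section 7.5), the E-optimal variant chooses experiment weights that maximize the smallest eigenvalue of the information matrix. The sketch below is editorial, not code from this commit; the data `V`, the dimensions, and the variable names are made up for illustration.

```julia
# Editorial sketch of E-optimal experiment design: maximize t such that
# Σᵢ λᵢ vᵢ vᵢᵀ ⪰ t·I, where λ is a probability vector over the p candidate
# test vectors (the columns of V). Data and names are illustrative.
using JuMP
import LinearAlgebra
import SCS

q, p = 4, 8                      # vector dimension, number of candidate tests
V = randn(q, p)                  # columns are the candidate test vectors
model = Model(SCS.Optimizer)
set_silent(model)
@variable(model, λ[1:p] >= 0)
@variable(model, t)
@constraint(model, sum(λ) == 1)
# Build the information matrix entrywise so each entry is affine in λ:
S = sum(λ[i] .* (V[:, i] * V[:, i]') for i in 1:p)
@constraint(model, LinearAlgebra.Symmetric(S - t .* LinearAlgebra.diagm(ones(q))) in PSDCone())
@objective(model, Max, t)
optimize!(model)
```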
11 changes: 5 additions & 6 deletions docs/src/tutorials/conic/introduction.md
@@ -39,12 +39,11 @@ will help you know where to look for certain things.
then formulate it in mathematics, and then solve it in JuMP. This usually
involves some sort of visualization of the solution. Start here if you are
new to JuMP.
- * [Experiment design](@ref)
- * [Logistic regression](@ref)
- * The [Tips and tricks](@ref conic_tips_and_tricks) tutorial contains a
-   number of helpful reformulations and tricks you can use when modeling
-   conic programs. Look here if you are stuck trying to formulate a problem
-   as a conic program.
+ * [Example: experiment design](@ref)
+ * [Example: logistic regression](@ref)
+ * The [Modeling with cones](@ref) tutorial contains a number of helpful
+   reformulations and tricks you can use when modeling conic programs. Look here
+   if you are stuck trying to formulate a problem as a conic program.
* The remaining tutorials are less verbose and styled in the form of short code
examples. These tutorials have less explanation, but may contain useful
code snippets, particularly if they are similar to a problem you are trying
44 changes: 18 additions & 26 deletions docs/src/tutorials/conic/logistic_regression.jl
@@ -20,19 +20,27 @@
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE #src
# SOFTWARE. #src

- # # Logistic regression
+ # # Example: logistic regression

# **This tutorial was originally contributed by François Pacaud.**

- # This tutorial shows how to solve a logistic regression problem
- # with JuMP. Logistic regression is a well known method in machine learning,
- # useful when we want to classify binary variables with the help of
- # a given set of features. To this goal,
- # we find the optimal combination of features maximizing
- # the (log)-likelihood onto a training set. From a modern optimization glance,
- # the resulting problem is convex and differentiable. On a modern optimization
- # glance, it is even conic representable.
- #
+ # This tutorial shows how to solve a logistic regression problem with JuMP.
+ # Logistic regression is a well known method in machine learning, useful when we
+ # want to classify binary variables with the help of a given set of features. To
+ # this goal, we find the optimal combination of features maximizing the
+ # (log)-likelihood onto a training set.
+
+ # ## Required packages
+
+ # This tutorial uses the following packages:
+
+ using JuMP
+ import MathOptInterface as MOI
+ import Random
+ import SCS
+
+ Random.seed!(2713);
+
# ## Formulating the logistic regression problem
#
# Suppose we have a set of training data-point $i = 1, \cdots, n$, where
Expand Down Expand Up @@ -122,22 +130,6 @@
# Thus, if $n \gg 1$, we get a large number of variables and constraints.

# ## Fitting logistic regression with a conic solver
#
# It is now time to pass to the implementation. We choose SCS as a conic solver.
- using JuMP
- import Random
- import SCS

- # !!! info
- #     This tutorial uses sets from [MathOptInterface](@ref moi_documentation).
- #     By default, JuMP exports the `MOI` symbol as an alias for the
- #     MathOptInterface.jl package. We recommend making this more explicit in
- #     your code by adding the following lines:
- #     ```julia
- #     import MathOptInterface as MOI
- #     ```
-
- Random.seed!(2713);

# We start by implementing a function to generate a fake dataset, and where
# we could tune the correlation between the feature variables. The function
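The conic reformulation referred to above is typically built from the softplus bound t ≥ log(1 + exp(u)), which is representable with two exponential cones because it is equivalent to exp(u - t) + exp(-t) ≤ 1. The sketch below is editorial, not code from this diff; the helper name `softplus!` is illustrative.

```julia
# Editorial sketch: model t >= log(1 + exp(u)) with two exponential cone
# constraints, exp(u - t) <= z[1] and exp(-t) <= z[2], plus z[1] + z[2] <= 1.
# The helper name `softplus!` is illustrative, not from the tutorial diff.
using JuMP
import MathOptInterface as MOI

function softplus!(model::Model, t, u)
    z = @variable(model, [1:2], lower_bound = 0.0)
    @constraint(model, sum(z) <= 1)
    @constraint(model, [u - t, 1, z[1]] in MOI.ExponentialCone())
    @constraint(model, [-t, 1, z[2]] in MOI.ExponentialCone())
    return
end
```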
2 changes: 1 addition & 1 deletion docs/src/tutorials/conic/min_ellipse.jl
@@ -3,7 +3,7 @@
# v.2.0. If a copy of the MPL was not distributed with this file, You can #src
# obtain one at https://mozilla.org/MPL/2.0/. #src

- # # Minimal ellipses
+ # # Example: minimal ellipses

# This example comes from section 8.4.1 of the book *Convex Optimization* by
# [Boyd and Vandenberghe (2004)](https://web.stanford.edu/~boyd/cvxbook/).
4 changes: 2 additions & 2 deletions docs/src/tutorials/conic/quantum_discrimination.jl
@@ -3,7 +3,7 @@
# v.2.0. If a copy of the MPL was not distributed with this file, You can #src
# obtain one at https://mozilla.org/MPL/2.0/. #src

- # # Quantum state discrimination
+ # # Example: quantum state discrimination

# This tutorial solves the problem of [quantum state discrimination](https://en.wikipedia.org/wiki/Quantum_state_discrimination).

@@ -13,7 +13,7 @@

# ## Required packages

- # This tutorial makes use of the following packages:
+ # This tutorial uses the following packages:

using JuMP
import LinearAlgebra
10 changes: 6 additions & 4 deletions docs/src/tutorials/conic/simple_examples.jl
@@ -5,10 +5,12 @@

# # Simple semidefinite programming examples

- # This tutorial is a collection of examples of small conic programs from the field of
- # [semidefinite programming](https://en.wikipedia.org/wiki/Semidefinite_programming) (SDP).
- #
- # This tutorial makes use of the following packages:
+ # The purpose of this tutorial is to provide a collection of examples of small
+ # conic programs from the field of [semidefinite programming](https://en.wikipedia.org/wiki/Semidefinite_programming) (SDP).
+
+ # ## Required packages
+
+ # This tutorial uses the following packages:

using JuMP
import LinearAlgebra
24 changes: 15 additions & 9 deletions docs/src/tutorials/conic/start_values.jl
@@ -9,23 +9,24 @@
# solution. This can improve performance, particularly if you are repeatedly
# solving a sequence of related problems.

+ # The purpose of this tutorial is to demonstrate how to write a function that
+ # sets the primal and dual starts as the optimal solution stored in a model. It
+ # is intended to be a starting point for which you can modify if you want to do
+ # something similar in your own code.
+
# !!! tip
#     See [`set_start_values`](@ref) for a generic implementation of this
#     function that was added to JuMP after this tutorial was written.

- # In this tutorial, we demonstrate how to write a function that sets the primal
- # and dual starts as the optimal solution stored in a model. It is intended to
- # be a starting point for which you can modify if you want to do something
- # similar in your own code.
-
- # !!! warning
- #     This tutorial does not set start values for nonlinear models.
+ # ## Required packages

# This tutorial uses the following packages:

using JuMP
import SCS

+ # ## A basic function
+
# The main component of this tutorial is the following function. The most
# important observation is that we cache all of the solution values first, and
# then we modify the model second. (Alternating between querying a value and
@@ -60,6 +61,8 @@ function set_optimal_start_values(model::Model)
return
end

+ # ## Testing the function
+
# To test our function, we use the following linear program:

model = Model(SCS.Optimizer)
@@ -81,8 +84,11 @@ optimize!(model)
# Now the optimization terminates after 0 iterations because our starting point
# is already optimal.

- # Note that some solvers do not support setting some parts of the starting
- # solution, for example, they may support only `set_start_value` for variables.
+ # ## Caveats
+
+ # Some solvers do not support setting some parts of the starting solution, for
+ # example, they may support only `set_start_value` for variables.
+
# If you encounter an `UnsupportedSupported` attribute error for
# [`MOI.VariablePrimalStart`](@ref), [`MOI.ConstraintPrimalStart`](@ref), or
# [`MOI.ConstraintDualStart`](@ref), comment out the corresponding part of the
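The pattern described above, cache every solution value first and only then write the starts back, looks roughly like the following editorial sketch. It shows only the variable primal part; the tutorial's full `set_optimal_start_values` also handles constraint primal and dual starts, and the function name below is illustrative.

```julia
# Editorial sketch of the cache-then-set pattern for warm starts.
# Only variable primal starts are shown; the function name is illustrative.
using JuMP

function set_variable_primal_starts(model::Model)
    x = all_variables(model)
    primal = value.(x)            # 1. cache every value while the solution is loaded
    set_start_value.(x, primal)   # 2. only then modify the model
    return
end
```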