
Incorrect results when kron, subtraction, and matrix multiplication of complex variables involved #510

Closed
not-fred opened this issue Aug 3, 2023 · 0 comments · Fixed by #522

not-fred commented Aug 3, 2023

The minimisation of the operator norm (maximum singular value)
$$\min_\rho ||U(KK^\dagger - I\otimes\rho)U^\dagger||, \text{ s.t. } \text{tr}(\rho)=1, \rho \succeq 0$$
gives incorrect results when both $U$ and $KK^\dagger$ are complex. Here $I$ is the identity matrix and $U$ is some unitary.

A few notes

  • The optimal value reported by Convex and the directly evaluated operator norm also disagree
  • The bug occurs with both the SCS and COSMO solvers
  • The bug does not occur with JuMP
  • This was the simplest optimisation I found which still exhibited the problem. Elements needed are:
    • Kron between complex variable and fixed value
    • Subtraction or addition of the above with a fixed value
    • Multiplication with another matrix
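
For context, the two PSD constraints in the code below are the standard epigraph reformulation of the operator norm of a Hermitian matrix (a general fact, not specific to this report): since $J = U(KK^\dagger - I\otimes\rho)U^\dagger$ is Hermitian whenever $\rho$ is,

$$||J|| \le t \iff -tI \preceq J \preceq tI \iff tI - J \succeq 0 \ \text{ and } \ tI + J \succeq 0.$$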

Minimal example:

```julia
using Convex, SCS, LinearAlgebra

function opt_Convex(U,K;d₁,d₂)
    ρ = Semidefinite(d₂)
    t = Variable()
    J = U*(K*K' - kron(I(d₁),ρ))*U'

    constraints = [
        tr(ρ) == 1,
        t*I(d₁*d₂)-J ⪰ 0,
        t*I(d₁*d₂)+J ⪰ 0,
    ]

    optimizer = SCS.Optimizer()
    Convex.MOI.set(optimizer, Convex.MOI.RawOptimizerAttribute("eps_abs"), 1E-9)
    Convex.MOI.set(optimizer, Convex.MOI.RawOptimizerAttribute("eps_rel"), 1E-12)

    problem = minimize(t, constraints)
    solve!(problem, optimizer)

    return problem.optval, opnorm(evaluate(J), 2)
end
```

```julia
d₁,d₂ = 2,2;

# K[1] is purely real, K[2] is purely imaginary, K[3] is complex
K = Any[(reshape(1:(d₁*d₂)^2,d₁*d₂,d₁*d₂) + im*reshape((d₁*d₂)^2+1:2*(d₁*d₂)^2,d₁*d₂,d₁*d₂)) for _ ∈ 1:3];
# K = Any[reshape(1:d₂^2,d₂,d₂) + im*reshape(d₂^2+1:2*d₂^2,d₂,d₂) for _ ∈ 1:3];
K[3] *= sqrt(d₁/tr(K[3]'K[3]));
K[1] = real.(K[3]);
K[1] *= sqrt(d₁/tr(K[1]'K[1]));
K[2] = K[1]*im;

# The unitaries are identities with complex phases
# U[1] is purely real, U[2] is purely imaginary, U[3] is complex
U = Any[I(d₁*d₂) for _ ∈ 1:3];
U[2] = U[1]*im;
U[3] = U[1]*exp(im*π/4);

# Each column should be the same because the operator norm
# is unitarily invariant. Here you’ll find a discrepancy, but only
# at Δarr_Convex[3,3,:]
Δarr_Convex = zeros(3,3,2);
for j ∈ 1:3, k ∈ 1:3
    Δarr_Convex[j,k,:] .= opt_Convex(U[j],K[k];d₁,d₂);
end
```

The output is

```
julia> Δarr_Convex
3×3×2 Array{Float64, 3}:
[:, :, 1] =
 0.994868  0.994868  0.999863
 0.994868  0.994868  0.999863
 0.994868  0.994868  0.999712

[:, :, 2] =
 0.994868  0.994868  0.999863
 0.994868  0.994868  0.999863
 0.994868  0.994868  0.999906
```

The optimisation gives incorrect results when both $U$ and $KK^\dagger$ are complex, and the reported optimal value does not agree with the actual evaluated operator norm either.
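
As a sanity check on the unitary-invariance claim above, here is a small standalone sketch (independent of the report's code; the matrices are arbitrary) verifying that conjugation by a unitary preserves the operator norm:

```julia
using LinearAlgebra

# Random Hermitian matrix and a random unitary (Q factor of a complex QR)
A = let M = randn(ComplexF64, 4, 4); Hermitian(M + M') end
U = Matrix(qr(randn(ComplexF64, 4, 4)).Q)

# The operator norm is invariant under conjugation by a unitary
@assert opnorm(U * A * U') ≈ opnorm(A)
```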

For reference, here is the code with JuMP

```julia
using JuMP

function opt_JuMP(U,K;d₁,d₂)
    model = Model(SCS.Optimizer)
    @variable(model, ρ[1:d₂,1:d₂] ∈ HermitianPSDCone())
    @variable(model, t)
    J = U*(K*K' - kron(I(d₁),ρ))*U'

    @constraint(model, tr(ρ)==1)
    @constraint(model, Hermitian(t*I(d₁*d₂)-J) ∈ HermitianPSDCone())
    @constraint(model, Hermitian(t*I(d₁*d₂)+J) ∈ HermitianPSDCone())

    MOI.set(model, MOI.RawOptimizerAttribute("eps_abs"), 1E-9)
    MOI.set(model, MOI.RawOptimizerAttribute("eps_rel"), 1E-12)

    @objective(model, Min, t)
    optimize!(model)

    return objective_value(model), opnorm(value.(J), 2)
end

Δarr_JuMP = zeros(3,3,2);
for j ∈ 1:3, k ∈ 1:3
    Δarr_JuMP[j,k,:] .= opt_JuMP(U[j],K[k];d₁,d₂);
end
```

With the output

```
julia> Δarr_JuMP
3×3×2 Array{Float64, 3}:
[:, :, 1] =
 0.994868  0.994868  0.99965
 0.994868  0.994868  0.99965
 0.994868  0.994868  0.99965

[:, :, 2] =
 0.994868  0.994868  0.99965
 0.994868  0.994868  0.99965
 0.994868  0.994868  0.99965
```
ericphanson added the bug label Aug 3, 2023
odow closed this as completed in #522 Dec 29, 2023