Ski Race

A skier begins at position \((x_0, y_0)\) and must ski through \(n\) gates to position \((x_{n+1}, y_{n+1})\), where downhill is the direction of increasing \(x\). The gates are parallel to one another and perpendicular to the \(x\) direction. For each \(i = 1, \cdots, n\), there is a gate at \(x_i\), centered at \(y_i\), with width \(c_i\).
Efficiently find the shortest path through all \(n\) gates.

Example

Consider the starting position \((x_0, y_0) = (0, 4)\) and ending position \((x_{n+1}, y_{n+1}) = (24, 4)\) and gates

| \(i\) | \(x_i\) | \(y_i\) | \(c_i\) |
|-------|---------|---------|---------|
| 1     | 4       | 5       | 3       |
| 2     | 8       | 4       | 2       |
| 3     | 12      | 6       | 2       |
| 4     | 16      | 5       | 1       |
| 5     | 20      | 7       | 2       |

Then the course looks like

[figure: ski course]

Solution

Let \(p_i\) be the \(y\) position of the skier at \(x_i\). For the skier to pass through the gate at \(x_i\), we need \(p_i \in [ y_i - \frac{c_i}{2} , y_i + \frac{c_i}{2} ]\).

The shortest route between adjacent gates at \(x_i\) and \(x_{i+1}\) is a straight segment of length \(\sqrt{ (x_{i+1} - x_i)^2 + (p_{i+1} - p_i)^2 }\), so we can formulate our optimization problem as

\[\begin{align*} & \underset{p}{\mathrm{minimize}} & & \sum_{i=0}^n \sqrt{ (x_{i+1} - x_i)^2 + (p_{i+1} - p_i)^2 } \\ & \text{subject to} & & p_0 = y_0 \\ & & & p_{n+1} = y_{n+1} \\ & & & p_i \geq y_i - \frac{c_i}{2} \quad i = 1, \cdots, n \\ & & & p_i \leq y_i + \frac{c_i}{2} \quad i = 1, \cdots, n \end{align*}\]
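To make the objective concrete, here is a small numpy sketch that evaluates the sum of segment lengths for a candidate path (the helper name `path_length` is ours, not from the text):

```python
import numpy as np

def path_length(x, p):
    # sum_i sqrt((x_{i+1} - x_i)^2 + (p_{i+1} - p_i)^2)
    return float(np.sum(np.hypot(np.diff(x), np.diff(p))))

# A single 3-4-5 right-triangle segment:
print(path_length(np.array([0, 3]), np.array([0, 4])))  # 5.0
```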

For simplicity, we can define \(p = \begin{bmatrix} p_0 \\ \vdots \\ p_{n+1} \end{bmatrix}\), \(y = \begin{bmatrix} y_0 \\ \vdots \\ y_{n+1} \end{bmatrix}\), and \(c = \begin{bmatrix} 0 \\ c_1 \\ \vdots \\ c_n \\ 0 \end{bmatrix}\) to write

\[\begin{align*} & \underset{p}{\mathrm{minimize}} & & \sum_{i=0}^n \sqrt{ (x_{i+1} - x_i)^2 + (p_{i+1} - p_i)^2 } \\ & \text{subject to} & & p \geq y - \frac{c}{2} \\ & & & p \leq y + \frac{c}{2} \end{align*}\]
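For the example course, the bound vectors \(y - \frac{c}{2}\) and \(y + \frac{c}{2}\) are easy to form with numpy (a quick sketch; the arrays mirror the table above, with \(c = 0\) at both endpoints so the start and finish positions are pinned exactly):

```python
import numpy as np

y = np.array([4, 5, 4, 6, 5, 7, 4])
c = np.array([0, 3, 2, 2, 1, 2, 0])
lower = y - c / 2   # [4.  3.5 3.  5.  4.5 6.  4. ]
upper = y + c / 2   # [4.  6.5 5.  7.  5.5 8.  4. ]
print(lower, upper)
```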

The square roots may make the objective look non-convex, but each term is the \(\ell_2\) norm of an affine function of \(p\), so the problem is in fact convex. A change of variables makes this structure explicit and easy to hand to a solver:

Let \(d_i = p_i - p_{i-1}\) for \(i = 1, \cdots, n+1\), with \(d_0 = p_0 = y_0\); so \(d_1 = p_1 - p_0\), \(d_2 = p_2 - p_1\), and so on. Summing these differences gives \(p_i = \sum_{j=0}^i d_j\), so we can neatly relate \(p\) and \(d\) by \(p = L d\), where \(L\) is a lower-triangular matrix of ones (as produced by np.tril). Then our optimization problem becomes

\[\begin{align*} & \underset{d}{\mathrm{minimize}} & & \sum_{i=1}^{n+1} \sqrt{ (x_{i} - x_{i-1})^2 + (d_i)^2 } \\ & \text{subject to} & & L d \geq y - \frac{c}{2} \\ & & & L d \leq y + \frac{c}{2} \end{align*}\]

Now the objective is the sum of the \(\ell_2\) norms of \(\begin{bmatrix} x_i - x_{i-1} \\ d_i \end{bmatrix}\), which is convex in \(d\). Also, the constraints are all affine, so this is a convex optimization problem.
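The relation \(p = L d\) can be checked numerically. This sketch builds \(d\) from a candidate path and confirms that multiplying by the lower-triangular ones matrix (equivalently, taking a cumulative sum) recovers \(p\):

```python
import numpy as np

p = np.array([4.0, 5.0, 4.0, 6.0, 5.0, 7.0, 4.0])  # any candidate path
d = np.concatenate(([p[0]], np.diff(p)))           # d_0 = p_0, d_i = p_i - p_{i-1}
L = np.tril(np.ones((7, 7)))
print(np.allclose(L @ d, p))         # True
print(np.allclose(np.cumsum(d), p))  # True: L @ d is a cumulative sum
```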

We can apply our solution to the above values:

import numpy as np
from cvxpy import Variable, Minimize, Problem, norm, hstack

x = np.array([0, 4, 8, 12, 16, 20, 24])  # x_0, ..., x_6
y = np.array([4, 5, 4, 6, 5, 7, 4])      # start, gate centers, finish
c = np.array([0, 3, 2, 2, 1, 2, 0])      # widths; 0 at the ends pins the endpoints
L = np.tril(np.ones((7, 7)))             # p = L @ dy

dy = Variable(7)  # dy[0] = p_0 and dy[i] = p_i - p_{i-1} for i >= 1

# Each term is the l2 norm of (x_i - x_{i-1}, dy_i), which is convex in dy.
objective = sum(
    norm(hstack([x[i] - x[i - 1], dy[i]]))
    for i in range(1, 7)
)
# The recovered path L @ dy must stay within every gate.
constraints = [
    L @ dy >= y - c / 2,
    L @ dy <= y + c / 2,
]
problem = Problem(Minimize(objective), constraints)
problem.solve()

p = L @ dy.value
path = np.column_stack((x, p))
path_length = problem.value
print(f"Path length: {path_length}")
print("Path:")
print(path)
Path length: 24.573423483752443
Path:
[[ 0.          4.        ]
 [ 4.          4.37494654]
 [ 8.          4.749936  ]
 [12.          5.12493412]
 [16.          5.49999986]
 [20.          6.        ]
 [24.          4.        ]]
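As a sanity check (a standalone sketch, with the path values copied from the output above), the reported length can be recomputed from the solution path, and every \(p_i\) can be verified to lie within its gate:

```python
import numpy as np

path = np.array([
    [ 0.0, 4.0       ],
    [ 4.0, 4.37494654],
    [ 8.0, 4.749936  ],
    [12.0, 5.12493412],
    [16.0, 5.49999986],
    [20.0, 6.0       ],
    [24.0, 4.0       ],
])
x, p = path[:, 0], path[:, 1]
length = np.hypot(np.diff(x), np.diff(p)).sum()
print(round(length, 4))  # 24.5734, matching problem.value

y = np.array([4, 5, 4, 6, 5, 7, 4])
c = np.array([0, 3, 2, 2, 1, 2, 0])
inside = np.all(p >= y - c / 2 - 1e-6) and np.all(p <= y + c / 2 + 1e-6)
print(bool(inside))  # True
```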

which we can visualize

[figure: the solution path]