Definition:
- Goal: choose the n-vector x so that the k norm-squared objectives J1 = ∣∣A1x−b1∣∣2, ..., Jk = ∣∣Akx−bk∣∣2 are all small
- where Ai is an mi×n matrix, bi is an mi-vector for i=1,...,k
- Ji are the objectives in a multi-objective optimization problem
Weighted sum objective:
- Choose positive weights λ1, λ2, ..., λk and form the weighted-sum objective: J = λ1J1 + ... + λkJk = λ1∣∣A1x−b1∣∣2 + ... + λk∣∣Akx−bk∣∣2
- so we need to choose x to minimize J
- Interpretation of λi: how much we care about Ji being small relative to the other objectives (see the evaluation sketch below)
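A minimal sketch of evaluating the weighted-sum objective for a given x, assuming made-up random data and a hypothetical helper name weighted_sum_objective:

```python
import numpy as np

# Hypothetical small instance: k = 2 objectives with random data.
rng = np.random.default_rng(0)
A1, b1 = rng.standard_normal((10, 3)), rng.standard_normal(10)
A2, b2 = rng.standard_normal((5, 3)), rng.standard_normal(5)
lams = [1.0, 2.0]  # lambda_2 = 2: we care twice as much about J2 being small

def weighted_sum_objective(x, As, bs, lams):
    # J = sum_i lambda_i * ||A_i x - b_i||^2
    return sum(lam * np.sum((A @ x - b) ** 2) for lam, A, b in zip(lams, As, bs))

x = np.zeros(3)
print(weighted_sum_objective(x, [A1, A2], [b1, b2], lams))
```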
Weighted sum minimization via stacking:
- Write the weighted-sum objective as J = ∣∣√λ1(A1x−b1)∣∣2 + ... + ∣∣√λk(Akx−bk)∣∣2 = ∣∣A~x−b~∣∣2
- where A~ is the stacked matrix with blocks √λ1A1, ..., √λkAk and b~ is the stacked vector with blocks √λ1b1, ..., √λkbk
Weighted sum solution:
- Assuming the columns of A~ are independent, x^ = (A~⊺A~)−1A~⊺b~ = (λ1A1⊺A1 + ... + λkAk⊺Ak)−1(λ1A1⊺b1 + ... + λkAk⊺bk)
- Then x^ can be computed with a QR factorization of A~, as in the sketch below
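A minimal NumPy sketch of the stacking-and-QR solution; the function name mols_solve is mine, and it assumes the stacked matrix has independent columns:

```python
import numpy as np

def mols_solve(As, bs, lams):
    # Stack sqrt(lambda_i)*A_i and sqrt(lambda_i)*b_i, then solve the
    # ordinary least squares problem minimize ||A_tilde x - b_tilde||^2.
    A_tilde = np.vstack([np.sqrt(lam) * A for lam, A in zip(lams, As)])
    b_tilde = np.concatenate([np.sqrt(lam) * b for lam, b in zip(lams, bs)])
    # QR-based solve: A_tilde = QR, so x^ = R^{-1} Q^T b_tilde.
    Q, R = np.linalg.qr(A_tilde)
    return np.linalg.solve(R, Q.T @ b_tilde)
```

With the data from the earlier sketch, mols_solve([A1, A2], [b1, b2], lams) returns the x^ that minimizes the weighted-sum objective.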
Optimal trade-off curve:
- Vary λ and plot the resulting (J1, J2) pairs to see how the solution trades one objective off against the other
- For example: with bi-criterion problem J=J1+λJ2
- If J2 is too big, increase λ so the minimizer prioritizes lowering J2
- If J1 is too big, decrease λ so the minimizer prioritizes lowering J1 (see the sweep sketch below)
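A sketch of sweeping λ for the bi-criterion case to trace out the trade-off curve; the data is random and the second objective is chosen as a regularization-style term purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
A1, b1 = rng.standard_normal((20, 5)), rng.standard_normal(20)
A2, b2 = np.eye(5), np.zeros(5)  # illustrative second objective: J2 = ||x||^2

for lam in np.logspace(-3, 3, 7):
    # Minimize J1 + lam*J2 via stacking, with sqrt(lam) on the second block.
    A_tilde = np.vstack([A1, np.sqrt(lam) * A2])
    b_tilde = np.concatenate([b1, np.sqrt(lam) * b2])
    x_hat, *_ = np.linalg.lstsq(A_tilde, b_tilde, rcond=None)
    J1 = np.sum((A1 @ x_hat - b1) ** 2)
    J2 = np.sum((A2 @ x_hat - b2) ** 2)
    print(f"lambda={lam:9.3f}  J1={J1:8.3f}  J2={J2:8.3f}")
```

Plotting the (J1, J2) pairs gives the optimal trade-off curve: as λ grows, J2 decreases while J1 increases.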
Estimation and inversion: