SLAM Engineer

🌈 [SymForce Tutorial Part 4] Robust Optimization Tutorial


Robust Optimization ์ด๋ž€?

  • In the real world, false correspondences arise for a variety of reasons.
  • Such wrong constraints can corrupt the entire solution.
  • Let's learn, and practice, methods that keep the optimization robust even in these situations.

Robust Optimization์˜ ํ•„์š”์„ฑ

Recap

  • Continuing from the previous posts (Sim(3) ICP, Pose-graph Optimization), let's keep practicing how to solve nonlinear optimization problems in robotics.
    • Using SymForce, of course!
    • ps. For how (non)linear problems in robotics are solved via least squares optimization, I recommend reading Prof. Grisetti's paper Least squares optimization: From theory to practice.

์ด์ƒ vs ํ˜„์‹ค

  • ์ด์ƒ: ๊ทธ๋Ÿฐ๋ฐ ์•ž์˜ ์˜ˆ์ œ๋“ค์—์„œ๋Š” True correspondence ๋ฅผ ๊ฐ€์ •ํ–ˆ์—ˆ๋‹ค.
  • ํ˜„์‹ค: ํ•˜์ง€๋งŒ ํ˜„์‹ค ์„ธ๊ณ„์—์„œ๋Š”, ์„ผ์„œ ๋…ธ์ด์ฆˆ, ์œ ์‚ฌํ•œ ์žฅ์†Œ, ์•Œ๊ณ ๋ฆฌ์ฆ˜์˜ ํ•œ๊ณ„ ๋“ฑ ๋‹ค์–‘ํ•œ ์ด์œ ๋กœ ์ธํ•ด False correspondence๊ฐ€ ์ƒ๊ธฐ๋Š” ๊ฒƒ์„ ํ”ผํ•  ์ˆ˜ ์—†๋‹ค.
    • ์˜ˆ๋ฅผ ๋“ค๋ฉด ์•„๋ž˜ ๊ทธ๋ฆผ๊ณผ ๊ฐ™์€ ์ƒํ™ฉ์ด ๋ฐœ์ƒํ•  ์ˆ˜ ์žˆ๋‹ค. (Full ํŠœํ† ๋ฆฌ์–ผ ์˜์ƒ์€ ์—ฌ๊ธฐ์„œ ๋ณผ ์ˆ˜ ์žˆ๋‹ค)

      1. In the leftmost figure, the green lines denote true correspondences and the red pair lines denote false correspondences.
        • That is, a green pair connects parts that are semantically identical, while a red pair connects parts that are actually different, so those two parts must not be pulled together.
      2. The solver, however, has no way of knowing which pairs are true and which are false, so when it tries to satisfy all of these constraints simultaneously, the solution gets corrupted. The middle figure is such an example.
        • ps. Here the outlier ratio is exaggerated to half (50%), but without the robust kernel explained in this tutorial, the solution can fail to converge with as little as 5% outliers (we will see this in the hands-on section later).
          • ps2. In fact, in practice even up to 50% seems to be considered a low outlier rate.
            • In Estimation Contracts for Outlier-Robust Geometric Perception, a recent paper by Prof. Luca Carlone at MIT, who works extensively on robust optimization in robotics, the following remark appears. (His group studies how to optimize robustly even when the outlier ratio is not known in advance, and even when 99% of the measurements are outliers.)
                For the case with low-outlier rates (i.e., β << 0.5), ...
              
      3. Meanwhile, with robust optimization, the rightmost figure shows the gray dragon correctly registered up to the level of the sky-blue dragon.
  • From here on, let's see, through a bit of theory plus hands-on practice, how to optimize safely even when such outlier correspondences are present!

Robust optimization ์„ ์œ„ํ•œ ์ ‘๊ทผ ๋‘ ์ข…๋ฅ˜

  • Before the hands-on part, let's briefly go over the concepts we need.

Least Squares Optimization

  • Every least squares optimization problem ultimately follows the same prototype: minimize a sum of per-measurement cost terms.
    • There is some model term (a cost function, i.e., the difference between the observation model and the measurement) $\text{x}$, and we want the optimal value of the parameter $\theta$ that minimizes the total sum of these terms.
      • ps. In robotics, the cost functions (observation models) are usually nonlinear, so we need the Jacobian, the derivative of $\text{x}$; the previous tutorials repeatedly emphasized that SymForce takes care of this for us, so this time we skip that explanation.
    • The index variable $i$ refers to the individual correspondences. For example, if the dragon in the point cloud figure above consists of 1000 points and there are 1000 correspondences, $i$ indexes them; for some values of $i$ the correspondence is true, and for others it is false, all mixed together.
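Written out, the prototype referred to above is (reconstructed here from the surrounding description, so take the notation as illustrative; $\text{x}_{i}(\theta)$ denotes the $i$-th cost term):

$$\theta^{*} = \operatorname*{argmin}_{\theta} \sum_{i} \big\| \text{x}_{i}(\theta) \big\|^{2}$$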
  • ์šฐ๋ฆฌ์˜ ๋ชฉํ‘œ๋Š” false correspondence ์˜ ์˜ํ–ฅ๋ ฅ์„ ์—†์• ๋Š” ๊ฒƒ์ด๋‹ค. ์–ด๋–ป๊ฒŒ ํ•  ์ˆ˜ ์žˆ์„๊นŒ?
    1. ๋จผ์ €, ๋ˆ„๊ฐ€ false correspondence ์ธ์ง€ ํ™•์‹คํ•˜๊ฒŒ ์ฐพ์•„์„œ ์ œ๊ฑฐํ•˜๋Š” ๋ฐฉ๋ฒ•์ด ์žˆ์„ ์ˆ˜ ์žˆ๋‹ค. ์ด๋ฅธ๋ฐ” explicit removal ์ด๋ผ๊ณ  ๋ถ€๋ฅผ ์ˆ˜ ์žˆ๊ฒ ๋‹ค.
    2. ๊ทธ๋Ÿฐ๋ฐ ๊ทธ๊ฒƒ์ด ๋งŒ์•ฝ ์–ด๋ ต๋‹ค๋ฉด, false correspondence โ€˜์ผ ๊ฒƒ ๊ฐ™์€โ€™ term ์•ž์— ์•„์ฃผ ์ž‘์€ weight ๋ฅผ ๊ณฑํ•ด์ฃผ๋ฉด ๋˜๊ฒ ๋‹ค. ์ด๋ฅธ๋ฐ” deweighting ๋ฐฉ๋ฒ•์ด๋ผ๊ณ  ํ•  ์ˆ˜ ์žˆ๊ฒ ๋‹ค.
  • ์ด ๋‘ ๋ถ€๋ฅ˜์— ๊ด€ํ•œ ์„ค๋ช…์€ Scale-Variant Robust Kernel Optimization for Non-linear Least Squares Problems ๋…ผ๋ฌธ์˜ 1์ชฝ์— ์ž˜ ์„ค๋ช…๋˜์–ด ์žˆ๋‹ค (1์ชฝ๋งŒ ์ฝ์–ด๋„ ์ถฉ๋ถ„ํ•จ).
    • ์—ฌ๊ธฐ์„œ๋„ ๊ฐ„๋‹จํ•˜๊ฒŒ ์ •๋ฆฌ๋ฅผ ํ•ด๋ณด์ž.

1. Explicit Removal

  • The family represented by RANSAC.
    • Robotics-application papers in this family (up to August 2022) are also well surveyed in the Scale-Variant Robust Kernel Optimization for Non-linear Least Squares Problems paper just introduced, so see that for references.
    • Therefore (and because RANSAC is so well known), this post skips the detailed explanation.
    • If I had to pick just one recent robotics paper, I would recommend the 2018 ICRA paper Pairwise consistent measurement set maximization for robust multirobot map merging.
      • ps. It is also worth knowing the term maximal clique that appears there: as max clique inlier selection (MCIS), it shows up in many recent papers such as TEASER++ (20 TRO, Teaser: Fast and certifiable point cloud registration) and Quatro (22 ICRA, A Single Correspondence Is Enough: Robust Global Registration to Avoid Degeneracy in Urban Environments).
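To make the maximal-clique idea concrete, here is a toy sketch in plain Python/NumPy (my own illustration, not code from the cited papers): a rigid transform preserves pairwise distances, so two correspondences can both be inliers only if they agree on distance; the maximum clique of this consistency graph is then taken as the inlier set.

```python
import itertools

import numpy as np

# Illustrative toy data: 12 correspondences, the last 4 of which are false.
rng = np.random.default_rng(2)
P = rng.uniform(0.0, 10.0, size=(12, 2))      # source points
R = np.array([[0.0, -1.0], [1.0, 0.0]])       # 90-degree rotation
Q = P @ R.T + np.array([3.0, 1.0])            # targets under a rigid transform
Q[8:] = rng.uniform(0.0, 10.0, size=(4, 2))   # corrupt 4 correspondences

# Correspondences i and j can only both be inliers if
# |dist(P_i, P_j) - dist(Q_i, Q_j)| is (near) zero.
n, tau = len(P), 1e-3
consistent = {
    (i, j)
    for i, j in itertools.combinations(range(n), 2)
    if abs(np.linalg.norm(P[i] - P[j]) - np.linalg.norm(Q[i] - Q[j])) < tau
}

# Brute-force maximum clique of the consistency graph (fine at toy sizes;
# TEASER++/Quatro use far more efficient max-clique solvers).
inliers = ()
for k in range(n, 0, -1):
    for cand in itertools.combinations(range(n), k):
        if all(pair in consistent for pair in itertools.combinations(cand, 2)):
            inliers = cand
            break
    if inliers:
        break

print(inliers)  # the mutually consistent (true) correspondences
```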

2. (Implicit) Deweighting

  • The family often called M-estimators.
    • The consensus-based outlier rejection techniques introduced above are effective, but they take a long time.
    • Moreover, in the real world, deciding which pairs are false correspondences can still be hard, so even after an explicit removal step we cannot expect 100% of the false correspondences to be gone.
    • So we need a last line of defense.
    • That role is played by the family also known as M-estimators; people also speak of a 'robust loss function (e.g., see Ceres)' or of 'applying a robust kernel'.
  • Summary
    • The key idea is to make a small weight get multiplied onto the terms that are 'likely' false correspondences.
    • How is that done?
      • The least squares optimization above is actually solved iteratively: given an initial estimate, we optimize the "delta" that tells us in which direction and by how much to move to get closer to the optimum. For this part, I always point people to Prof. Grisetti's tutorial material.
      • Therefore, if the initial value is reasonably good (it has to be, since our observation models are mostly nonlinear, or we cannot solve the problem at all), we can expect the cost of a false correspondence to be large.
      • So we could simply decide the weight based on the error value! But rather than plastering the code with if-statements, we do it very naturally: we just wrap some function $\rho(\cdot)$ around the existing cost function. This function is also called a kernel.
    • See the slides below for details.
  • Explanation (slides from SNU talk)

    • As introduced in the slides above, the 1995 paper Parameter Estimation Techniques: A Tutorial with Application to Conic Fitting lists many kernels: Huber, Cauchy, and so on.
      • In the end they all just tune the deweighting sensitivity slightly differently; the underlying concept is nearly the same.
      • Meanwhile, at CVPR 2019, John Barron published a paper showing that all of these diverse kernels are in fact just special cases of one general formula!
        • 2019 CVPR, A general and adaptive robust loss function
        • This is already implemented in SymForce under the name BarronNoiseModel, so that is what we will use in the hands-on section below.
        • Digging in, you can find the following docstring in SymForce's noise_models.py:
            # see the class BarronNoiseModel(ScalarNoiseModel) definition in noise_models.py
            """
                alpha: Controls shape and convexity of the loss function. Notable values:
                    alpha = 2 -> L2 loss
                    alpha = 1 -> Pseudo-huber loss
                    alpha = 0 -> Cauchy loss
                    alpha = -2 -> Geman-McClure loss
                    alpha = -inf -> Welsch loss
                delta: Determines the transition point from quadratic to robust. Similar to "delta" as used
                    by the pseudo-huber loss function.
                scalar_information: Scalar representing the inverse of the variance of an element of the
                    unwhitened residual. Conceptually, we use "scalar_information" to whiten (in a
                    probabalistic sense) the unwhitened residual before passing it through the Barron loss.
                x_epsilon: Small value used for handling the singularity at x == 0.
                alpha_epsilon: Small value used for handling singularities around alpha.
            """
          
    • ์ด ๊ณ„์—ด์˜ ๋ฐฉ๋ฒ•์€ ๊ตฌํ˜„ํ•˜๋Š” ์ž…์žฅ์—์„œ ๋˜๊ฒŒ straightforward ํ•˜๋‹ค๋Š” ๊ฒƒ์ด ์žฅ์ ์ด๋‹ค.

Hands-on

  • You can try the exercise yourself with the Jupyter notebook here.
    • The backbone of the practice code is identical to the previous installment.
    • The only changes are the part that mixes in false correspondences and the part that applies the robust kernel.
    • With the existing ICP algorithm code left completely unchanged, simply applying a robust kernel (a minimal code change!) is enough to handle outliers conveniently; that is the strength of the M-estimation approach!
      • Try the exercise and feel it for yourself.
      • ps. Of course, mixing in a few tricks helps. For example, in the practice code above, rescaling the magnitude of certain dimensions of the 7-dim state vector being optimized (especially the scale part) to be somewhat smaller than the other elements makes convergence more stable.
  • For the detailed walkthrough, see the video instead.

Conclusion

Summary

  1. We covered the two families of robust optimization (1. outlier pruning, 2. deweighting).
  2. We practiced Sim(3) registration that works well even in the presence of outliers.
    1. Using SymForce's Barron loss, deweighting is easy to implement.
    2. Without a robust loss, the solution fails to converge with as little as 5% outliers.
    3. With the robust loss, on the other hand, the solution converges well without breaking even at 50% outliers.
      • note: There is still an upper limit, though. When solving real-world problems, after the robust loss has safely produced a reasonable estimate, you will want to re-run the correspondence search to reduce the outlier ratio itself once more.

Asides

Application to Pose-graph Optimization

  • As we saw in the previous post, pose-graph optimization can overcome the drift accumulated along a trajectory and help build a globally consistent map. That example used a simple circular trajectory, and again the correspondences were given: we already knew the robot had made one full loop, so it was known that the very first node and the very last node should be connected.
  • Of course, in the real world that fact is not known so easily; the field that studies this is place recognition.
  • Because perceptual aliasing is frequent in the real world, false correspondences are easily produced during place recognition as well.
    • For example, as below:

  • So when performing pose-graph optimization, it is also best to treat the deweighting introduced above as mandatory.
    • Otherwise, the whole trajectory gets corrupted, as below (figure from 13 ICRA, Robust map optimization using dynamic covariance scaling).

  • With deweighting applied (simply a Cauchy kernel), the figure below shows that even with very many false pairs, the overall trajectory is reconstructed well without breaking.
    • The figure below is an example result of running SC-A-LOAM.

The three important things in computer vision

  • One fun story as an aside.
  • The PhD thesis of Prof. Xiaolong Wang, who got his doctorate at CMU, contains the following.

    • It is (believe it or not) attributed to the famous Takeo Kanade (of Lucas-Kanade!) in computer vision.
    • Asked to name the three most important things among the tasks computer vision deals with: correspondence, correspondence, and correspondence …
    • In this post we looked at M-estimation-based robust optimization, which plays a defensive role; but in the end it is only a defense. As we saw in the earlier posts, if the correspondences are true, the optimization can hardly fail in theory.
    • This is exactly where deep learning has recently been excelling.
      • For example, the deep matching methods represented by SuperPoint, SuperGlue, and LoFTR.
      • Accordingly, in Visual Localization (i.e., the camera's 6D pose estimation), another important problem in robotics, a SuperGlue-based method has achieved state of the art.
      • So in visual localization too, once the correspondences are predicted well, the optimization itself is just a calculator, and you generally obtain a good-quality solution.
        • In the early days of deep learning (2015), there was a belief that if you just feed a network (input, output) data and train it end-to-end, something magical would happen inside and it would report the pose well even for new inputs (15 CVPR, Posenet: A convolutional network for real-time 6-dof camera relocalization) ...
        • But when people actually tried this, it worked worse than expected; this suspicion persisted among researchers,
        • until, at CVPR 2019, the paper Understanding the Limitations of CNN-based Absolute Camera Pose Regression squarely called that belief out. And Hloc, by the SuperGlue authors, showed it experimentally as well: deep learning only needs to establish the feature correspondences. Once good 2D-3D matches are generated, the well-established classical mathematics of PnP is enough for the rest (that part is not deep learning's job).
          • Of course, making such conventional geometric pipelines (e.g., SVD, PnP, …) themselves differentiable is a worthwhile direction! That effort is exactly the Kornia project.
    • This digression has gotten long, so let's save the localization story for another time.