Support Vector Machines (SVM)

zhangh04@gmail.com

These notes derive the support vector machine (SVM) step by step: the margin and the hard-margin primal problem, Lagrange duality and the dual problem, support vectors, kernels, the soft margin and the hinge loss, training algorithms (SMO and Pegasos), and several extensions (probabilistic outputs, multi-class SVM, and support vector regression).

1 Linear classification

We are given a training set {(x_1, y_1), (x_2, y_2), ..., (x_m, y_m)}, where x_i ∈ R^d and y_i ∈ {-1, +1}. The goal of binary classification is to learn a classifier h: R^d → {-1, +1} such that h(x_i) = y_i, i.e., h(x_i) should be +1 when y_i = +1 and -1 when y_i = -1. Equivalently, example i is classified correctly if and only if y_i h(x_i) = 1.

These notes consider linear classifiers,

    h(x_i) := sign(w^T x_i + b),    (3)

where w ∈ R^d and b ∈ R; the pair (w, b) determines a separating hyperplane. A linear classifier (w, b) classifies example i correctly if and only if

    y_i (w^T x_i + b) > 0,    (4)

because

    y_i h(x_i) = 1  ⟺  y_i sign(w^T x_i + b) = 1  ⟺  y_i (w^T x_i + b) > 0.    (5)

The support vector machine (SVM) [4] is an algorithm for finding such a pair (w, b). Many hyperplanes may separate the training data; the SVM chooses a particular one, the hyperplane with the largest margin, which is the subject of the next section.
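To make (3)-(5) concrete, here is a minimal NumPy sketch. The toy data, the candidate (w, b), and all numbers are made up purely for illustration and are not part of the original notes.

    import numpy as np

    # toy data: m = 4 examples in R^2 with labels in {-1, +1} (made-up values)
    X = np.array([[2.0, 2.0], [1.5, 3.0], [-1.0, -2.0], [-2.0, -0.5]])
    y = np.array([+1, +1, -1, -1])

    # some candidate linear classifier (w, b); not necessarily a good one
    w = np.array([1.0, 1.0])
    b = -0.5

    h = np.sign(X @ w + b)           # predictions, eq. (3)
    correct = y * (X @ w + b) > 0    # correctness condition, eq. (4)
    print(h, correct)                # every entry of `correct` is True iff (w, b) separates the data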
2 Margin

Among all hyperplanes that separate the training data, the SVM prefers the one that is, intuitively, "in the middle": the hyperplane whose distance to the closest training example (the margin) is largest, since such a hyperplane is the most robust to perturbations of the data.

Claim (distance to a hyperplane). The distance from a point p ∈ R^d to the hyperplane w^T x + b = 0 is

    (1/||w||) |w^T p + b|.    (6)

Proof sketch. Take any two points x_1, x_2 on the hyperplane. Then

    w^T (x_1 - x_2) = w^T x_1 - w^T x_2 = (-b) - (-b) = 0,    (7)

so w is orthogonal to the hyperplane, i.e., w is its normal vector. Now take any point x on the hyperplane. The distance from p to the hyperplane is the length of the projection of p - x onto w:

    ||proj_w (p - x)|| = ||p - x|| |cos ∠(w, p - x)|
                       = ||p - x|| · |w^T (p - x)| / (||w|| ||p - x||)
                       = (1/||w||) |w^T p - w^T x|
                       = (1/||w||) |w^T p + b|.    (8)

Definition (margin). The margin of a separating hyperplane (w, b) is its distance to the closest training example,

    γ := min_i (1/||w||) |w^T x_i + b|.

The SVM maximizes the margin over all hyperplanes that classify the training data correctly.

Problem (SVM, original form).

    max_{w,b}  min_i (1/||w||) |w^T x_i + b|    (9)
    s.t.  y_i (w^T x_i + b) > 0,  i = 1, 2, ..., m.

This max-min problem is awkward to optimize directly; the next section rewrites it as a quadratic program, a well-studied class of optimization problems with reliable solvers.

3 Hard-margin SVM

3.1 Reformulation as a quadratic program

Definition (quadratic program, QP). A quadratic program has the form

    min_u  (1/2) u^T Q u + t^T u    (10)
    s.t.  c_i^T u ≤ d_i,  i = 1, 2, ..., m,

i.e., a quadratic objective with linear inequality constraints.

Scaling invariance. If (w*, b*) is optimal for (9), then so is (r w*, r b*) for any r > 0, since the margin and the constraints are unchanged:

    (1/||r w*||) |(r w*)^T x_i + r b*| = (1/||w*||) |w*^T x_i + b*|,    (11)
    y_i ((r w*)^T x_i + r b*) > 0  ⟺  y_i (w*^T x_i + b*) > 0.    (12)

We may therefore fix the scale of (w, b) by requiring that the closest example satisfies

    min_i |w^T x_i + b| = 1.    (13)

Claim (hard-margin SVM). Problem (9) is equivalent to

    min_{w,b}  (1/2) w^T w    (14)
    s.t.  y_i (w^T x_i + b) ≥ 1,  i = 1, 2, ..., m.

Proof sketch. First, an optimal solution (w*, b*) of (14) must make at least one constraint tight. Otherwise y_i (w*^T x_i + b*) > 1 for every i, and we could pick 0 < r < 1 such that min_i y_i ((r w*)^T x_i + r b*) = 1; the pair (r w*, r b*) would still be feasible with ||r w*|| < ||w*||, contradicting optimality. Hence (14) can be written as

    arg min_{w,b}  (1/2) w^T w    (15)
    s.t.  min_i y_i (w^T x_i + b) = 1,

and then

    arg min_{(15)} (1/2) w^T w = arg max 1/||w||
                               = arg max (1/||w||) min_i y_i (w^T x_i + b)
                               = arg max min_i (1/||w||) |w^T x_i + b|,    (16)

where the second equality uses the constraint min_i y_i (w^T x_i + b) = 1 and the last equality uses y_i (w^T x_i + b) = |w^T x_i + b| for correctly classified examples. The right-hand side of (16) is exactly problem (9) under the normalization (13).
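A quick numerical sanity check of the margin formula (6) and the scaling invariance (11)-(12). The toy data, the candidate (w, b), and the helper name are ours, invented for illustration.

    import numpy as np

    def margin(X, y, w, b):
        # geometric margin of a separating (w, b): min_i |w^T x_i + b| / ||w||, cf. eq. (6)
        assert np.all(y * (X @ w + b) > 0), "not a separating hyperplane"
        return np.min(np.abs(X @ w + b)) / np.linalg.norm(w)

    X = np.array([[2.0, 2.0], [1.5, 3.0], [-1.0, -2.0], [-2.0, -0.5]])
    y = np.array([+1, +1, -1, -1])
    w, b = np.array([1.0, 1.0]), -0.5

    # rescaling (w, b) by any r > 0 leaves the margin unchanged, cf. eqs. (11)-(12)
    print(margin(X, y, w, b), margin(X, y, 3.0 * w, 3.0 * b))   # identical values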
Claim (the hard-margin primal is a QP). Problem (14) is a quadratic program with d + 1 variables and m constraints. It takes the form (10) with

    u := [w; b],  Q := [[I, 0], [0, 0]],  t := 0,    (17)
    c_i := -y_i [x_i; 1],  d_i := -1,    (18)

so that (1/2) u^T Q u = (1/2) w^T w and c_i^T u ≤ d_i is exactly y_i (w^T x_i + b) ≥ 1.

Problem (14) can therefore be handed to any off-the-shelf QP solver. However, to introduce kernels later we also need its dual problem, which is obtained via Lagrange duality.

3.2 Lagrange duality

Consider a general constrained problem:

    min_u  f(u)    (19)
    s.t.  g_i(u) ≤ 0,  i = 1, 2, ..., m,
          h_j(u) = 0,  j = 1, 2, ..., n.

Its Lagrangian is

    L(u, α, β) := f(u) + Σ_{i=1}^m α_i g_i(u) + Σ_{j=1}^n β_j h_j(u),    (20)

where the multipliers satisfy α_i ≥ 0.

Claim. Problem (19) is equivalent to

    min_u max_{α,β}  L(u, α, β)    (21)
    s.t.  α_i ≥ 0,  i = 1, 2, ..., m.

Proof sketch. For a fixed u,

    max_{α≥0, β} L(u, α, β) = f(u) + max_{α≥0, β} [ Σ_i α_i g_i(u) + Σ_j β_j h_j(u) ]
                            = f(u)  if u is feasible,  and  +∞  otherwise.    (22)

Indeed, if some g_i(u) > 0, taking α_i → ∞ drives α_i g_i(u) → ∞; if some h_j(u) ≠ 0, taking β_j = r sign(h_j(u)) with r → ∞ drives β_j h_j(u) → ∞. If instead u is feasible, then α_i ≥ 0 and g_i(u) ≤ 0 give α_i g_i(u) ≤ 0, so the inner maximum is attained with α_i g_i(u) = 0 and equals f(u). Hence minimizing (21) over u is the same as minimizing f over the feasible set.

Definition (KKT conditions). For problem (19) with Lagrangian (20), the Karush-Kuhn-Tucker (KKT) conditions are: primal feasibility, g_i(u) ≤ 0 and h_j(u) = 0; dual feasibility, α_i ≥ 0; and complementary slackness, α_i g_i(u) = 0. By the argument above, at an optimum of (21) the minimizing u and the maximizing multipliers satisfy all three conditions; in particular α_i g_i(u) = 0.

Definition (dual problem). The Lagrange dual of (19) exchanges the order of min and max:

    max_{α,β} min_u  L(u, α, β)    (23)
    s.t.  α_i ≥ 0,  i = 1, 2, ..., m.

Claim (weak duality). The dual optimum never exceeds the primal optimum:

    max_{α,β} min_u L(u, α, β)  ≤  min_u max_{α,β} L(u, α, β).    (24)

Proof sketch. For any fixed (α', β') with α' ≥ 0 and any u, we have min_{u'} L(u', α', β') ≤ L(u, α', β') ≤ max_{α,β} L(u, α, β). Taking the minimum over u on the right and then the maximum over (α', β') on the left gives (24).

Theorem (Slater's condition). If f and all g_i are convex, all h_j are affine, and there exists a strictly feasible point, then strong duality holds: the two sides of (24) are equal.
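The mapping (17)-(18) of the hard-margin primal into the generic QP form (10) is mechanical; the following sketch assembles the matrices with NumPy. The function name hard_margin_qp and the variable names are ours (C_mat here is the stacked constraint matrix, unrelated to the soft-margin parameter C introduced later).

    import numpy as np

    def hard_margin_qp(X, y):
        # Assemble the data (Q, t, C_mat, d_vec) of the generic QP (10)
        # for the hard-margin primal (14), following eqs. (17)-(18) with u = [w; b].
        m, d = X.shape
        Q = np.zeros((d + 1, d + 1))
        Q[:d, :d] = np.eye(d)                               # (1/2) u^T Q u = (1/2) w^T w
        t = np.zeros(d + 1)
        C_mat = -y[:, None] * np.hstack([X, np.ones((m, 1))])   # row i is c_i^T = -y_i [x_i^T, 1]
        d_vec = -np.ones(m)                                 # c_i^T u <= d_i  <=>  y_i (w^T x_i + b) >= 1
        return Q, t, C_mat, d_vec

Any generic QP solver can then be applied to these matrices.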
A proof of this theorem and much more background on convex optimization can be found in [2].

Back to the SVM: problem (14) has a convex quadratic objective and affine constraints, and for linearly separable data strictly feasible points exist, so Slater's condition applies and strong duality holds. Its Lagrangian is

    L(w, b, α) := (1/2) w^T w + Σ_{i=1}^m α_i (1 - y_i (w^T x_i + b)),    (25)

and the primal can be written in minimax form as

    min_{w,b} max_α  (1/2) w^T w + Σ_{i=1}^m α_i (1 - y_i (w^T x_i + b))    (26)
    s.t.  α_i ≥ 0,  i = 1, 2, ..., m.

Problem (hard-margin dual).

    min_α  (1/2) Σ_{i=1}^m Σ_{j=1}^m α_i α_j y_i y_j x_i^T x_j - Σ_{i=1}^m α_i    (27)
    s.t.  Σ_{i=1}^m α_i y_i = 0,
          α_i ≥ 0,  i = 1, 2, ..., m.

Derivation. The dual exchanges min and max in (26). For fixed α the inner minimization over (w, b) is unconstrained, so we set the derivatives to zero:

    ∂L/∂w = 0  ⟹  w = Σ_{i=1}^m α_i y_i x_i,    (28)
    ∂L/∂b = 0  ⟹  Σ_{i=1}^m α_i y_i = 0.    (29)

Substituting (28) and (29) back into (26) reduces the inner minimum to Σ_i α_i - (1/2) Σ_i Σ_j α_i α_j y_i y_j x_i^T x_j; maximizing this over α subject to α_i ≥ 0 and Σ_i α_i y_i = 0 is the same as minimizing its negative, which gives (27).

Claim (the dual is a QP). Problem (27) is a quadratic program with m variables and m + 2 constraints. In the form (10),

    u := α,  Q := [y_i y_j x_i^T x_j]_{i,j},  t := -1 (the vector of all -1's),    (30)
    c_i := -e_i,  d_i := 0,  i = 1, 2, ..., m,    (31)
    c_{m+1} := [y_1, y_2, ..., y_m]^T,  d_{m+1} := 0,    (32)
    c_{m+2} := -[y_1, y_2, ..., y_m]^T,  d_{m+2} := 0,    (33)

where e_i is the i-th standard basis vector. Constraints (31) encode α_i ≥ 0, and the pair (32)-(33) encodes the equality Σ_i α_i y_i = 0 as two opposite inequalities.

3.3 Support vectors

Claim (KKT conditions of the hard-margin SVM). Applying the KKT conditions to (14) with

    u := [w; b],  g_i(u) := 1 - y_i [x_i; 1]^T u,    (34)

gives: primal feasibility, 1 - y_i (w^T x_i + b) ≤ 0; dual feasibility, α_i ≥ 0; and complementary slackness, α_i (1 - y_i (w^T x_i + b)) = 0.

Definition (support vector). A training example x_i with α_i > 0 is called a support vector.

Claim (support vectors lie on the margin boundary). By complementary slackness, α_i > 0 forces 1 - y_i (w^T x_i + b) = 0, i.e., y_i (w^T x_i + b) = 1.

Claim (only support vectors matter). Let SV := {i : α_i > 0}. Then

    w = Σ_{i=1}^m α_i y_i x_i = Σ_{i: α_i = 0} 0 · y_i x_i + Σ_{i: α_i > 0} α_i y_i x_i = Σ_{i ∈ SV} α_i y_i x_i.    (35)

For b, pick any support vector x_s with label y_s. Since y_s (w^T x_s + b) = 1 and y_s ∈ {-1, +1},

    b = y_s - w^T x_s = y_s - Σ_{i ∈ SV} α_i y_i x_i^T x_s.    (36)

In practice, b is often computed from every support vector and averaged, which is numerically more stable.
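Equations (35)-(36) translate directly into a few lines of NumPy. The function name recover_w_b and the tolerance are ours; the threshold below which an α_i is treated as zero is an arbitrary illustrative choice.

    import numpy as np

    def recover_w_b(X, y, alpha, tol=1e-8):
        # Recover (w, b) from a dual solution alpha via eqs. (35)-(36);
        # b is averaged over all support vectors for numerical stability.
        sv = alpha > tol                      # support vectors: alpha_i > 0
        w = (alpha[sv] * y[sv]) @ X[sv]       # eq. (35)
        b = np.mean(y[sv] - X[sv] @ w)        # eq. (36), averaged over the support vectors
        return w, b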
Claim (the prediction depends only on inner products). Combining (35) and (36), the learned classifier is

    h(x) = sign( Σ_{i ∈ SV} α_i y_i x_i^T x + b ).    (37)

Both training (the dual (27)) and prediction (37) access the data only through inner products x_i^T x_j and x_i^T x. This observation is what makes the kernel trick possible [1].

4 Kernel methods

4.1 Feature mapping

A linear classifier cannot separate data that are not linearly separable. The standard remedy is to map the inputs into a higher-dimensional feature space with a feature map ϕ: R^d → R^d̃ (typically d̃ ≫ d, possibly infinite) and to train a linear SVM on the mapped data ϕ(x_i).

Claim. Data are more likely to be linearly separable after being mapped to a higher-dimensional space: the larger d̃ is, the more point configurations a hyperplane in R^d̃ can shatter, in the sense of VC theory [16].

Replacing x_i by ϕ(x_i) in the primal (14) and the dual (27) gives

    min_{w,b}  (1/2) w^T w    (38)
    s.t.  y_i (w^T ϕ(x_i) + b) ≥ 1,  i = 1, 2, ..., m;

    min_α  (1/2) Σ_i Σ_j α_i α_j y_i y_j ϕ(x_i)^T ϕ(x_j) - Σ_i α_i    (39)
    s.t.  Σ_i α_i y_i = 0,  α_i ≥ 0,  i = 1, 2, ..., m.

The primal (38) has d̃ + 1 variables, which may be enormous or even infinite; the dual (39) has m variables regardless of d̃. This is one reason to work with the dual.

4.2 The kernel trick

The dual (39) still needs the inner products ϕ(x_i)^T ϕ(x_j). Computing ϕ(x) explicitly and then taking inner products in R^d̃ costs O(d̃) per pair, which defeats the purpose when d̃ is huge. The kernel trick avoids this: a kernel function κ computes the inner product in feature space directly from the original inputs,

    κ(x_i, x_j) = ϕ(x_i)^T ϕ(x_j),    (40)

at a cost on the order of O(d) rather than O(d̃).

Example. For scalar inputs (d = 1), consider the feature map into an infinite-dimensional space

    ϕ: x ↦ exp(-x²) [1, √(2/1!) x, √(2²/2!) x², √(2³/3!) x³, ...]^T,    (41)

and the kernel

    κ(x_i, x_j) := exp(-(x_i - x_j)²).    (42)

Then κ computes exactly the inner product of the infinite feature vectors:

    κ(x_i, x_j) = exp(-(x_i - x_j)²)
                = exp(-x_i²) exp(-x_j²) exp(2 x_i x_j)
                = exp(-x_i²) exp(-x_j²) Σ_{k=0}^∞ (2 x_i x_j)^k / k!
                = Σ_{k=0}^∞ ( exp(-x_i²) √(2^k/k!) x_i^k ) ( exp(-x_j²) √(2^k/k!) x_j^k )
                = ϕ(x_i)^T ϕ(x_j),    (43)

even though ϕ(x) itself can never be written out in full.

4.3 Choosing a kernel

Table 1 lists some commonly used kernel functions. Which kernel (and which hyperparameters) to use for a given problem is largely an empirical question, usually settled by cross-validation, but there are rules of thumb: when the input dimension d is already large relative to the amount of data (e.g., text classification)
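The identity (43) can be checked numerically by truncating the infinite feature map (41). The function names and the truncation level K = 30 below are arbitrary illustrative choices.

    import numpy as np
    from math import factorial

    def phi(x, K=30):
        # truncated version of the infinite feature map in eq. (41): first K coordinates
        return np.array([np.exp(-x**2) * np.sqrt(2.0**k / factorial(k)) * x**k for k in range(K)])

    def kappa(xi, xj):
        # the kernel of eq. (42)
        return np.exp(-(xi - xj)**2)

    xi, xj = 0.7, -0.3
    print(kappa(xi, xj), phi(xi) @ phi(xj))   # agree up to truncation error, cf. eq. (43)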
a linear kernel is often sufficient; when d is small, the RBF kernel is a common default; and kernels can be designed specifically for structured data. Whatever the choice, a candidate κ must actually be a valid kernel, i.e., some feature map ϕ satisfying (40) must exist. Mercer's theorem characterizes validity [5].

Theorem (Mercer). κ is a valid kernel if and only if, for any finite set of points x_1, ..., x_m, the kernel matrix (Gram matrix)

    K := [κ(x_i, x_j)]_{i,j}    (44)

is symmetric positive semidefinite.

Proof (necessity). Suppose κ is valid, so K_ij = κ(x_i, x_j) = ϕ(x_i)^T ϕ(x_j). Stack the mapped points into

    Φ := [ϕ(x_1) ϕ(x_2) ... ϕ(x_m)] ∈ R^{d̃ × m},    (45)

so that K = Φ^T Φ. Then K is symmetric, and for any a ∈ R^m,

    a^T K a = a^T Φ^T Φ a = (Φ a)^T (Φ a) = ||Φ a||² ≥ 0.    (46)

The converse direction is omitted here; see [5].

New kernels can also be built from existing ones [9]. If κ_1 and κ_2 are valid kernels, then so are

    c_1 κ_1(x_i, x_j) + c_2 κ_2(x_i, x_j),  for any c_1, c_2 > 0,    (47)
    κ_1(x_i, x_j) κ_2(x_i, x_j),    (48)
    f(x_i) κ_1(x_i, x_j) f(x_j),  for any function f.    (49)

Proof sketch. Write κ_1(x_i, x_j) = ϕ_1(x_i)^T ϕ_1(x_j) and κ_2(x_i, x_j) = ϕ_2(x_i)^T ϕ_2(x_j). Then

    c_1 κ_1(x_i, x_j) + c_2 κ_2(x_i, x_j) = [√c_1 ϕ_1(x_i); √c_2 ϕ_2(x_i)]^T [√c_1 ϕ_1(x_j); √c_2 ϕ_2(x_j)],    (50)
    κ_1(x_i, x_j) κ_2(x_i, x_j) = vec(ϕ_1(x_i) ϕ_2(x_i)^T)^T vec(ϕ_1(x_j) ϕ_2(x_j)^T),    (51)
    f(x_i) κ_1(x_i, x_j) f(x_j) = (f(x_i) ϕ_1(x_i))^T (f(x_j) ϕ_1(x_j)),    (52)

so each of (47)-(49) is again an inner product of explicitly constructed feature vectors and is therefore a valid kernel.

4.4 The representer theorem

The kernel trick is not limited to the SVM. Consider the general regularized empirical risk minimization problem

    min_w  Σ_{i=1}^m l(w^T ϕ(x_i), y_i) + λ w^T w,    (53)

where l is an arbitrary loss function. Its optimum can always be written as a linear combination of the mapped training points,

    w = Σ_{i=1}^m α_i ϕ(x_i).    (54)

Proof sketch. Let

    Φ := [ϕ(x_1) ϕ(x_2) ... ϕ(x_m)].    (55)

Any w can be decomposed as

    w = Φ α + e,    (56)

where e is orthogonal to the span of the mapped points, i.e., ϕ(x_i)^T e = 0 for all i. The loss term does not depend on e:

    l(w^T ϕ(x_i), y_i) = l((Φ α + e)^T ϕ(x_i), y_i) = l((Φ α)^T ϕ(x_i), y_i);    (57)

while the regularizer only grows with e:

    ||w||² = ||Φ α||² + ||e||² + 2 (Φ α)^T e = ||Φ α||² + ||e||² ≥ ||Φ α||²,    (58)

with equality only when e = 0. Hence an optimal w has e = 0, which is exactly (54). The same argument works for any regularizer Ω(w) that is monotonically increasing in ||w||; see [13].

Consequently, predictions again require only kernel evaluations:

    w^T ϕ(x) = Σ_{i=1}^m α_i ϕ(x_i)^T ϕ(x) = Σ_{i=1}^m α_i κ(x_i, x).    (59)

Learning algorithms that access the data only through a kernel in this way are collectively called kernel methods [8].

5 Soft margin

So far we have assumed that the (mapped) training data are linearly separable. Real data are often noisy: even in a high-dimensional feature space a perfectly separating hyperplane may not exist, and when it does, insisting on separating every single example can overfit. The soft-margin SVM allows some examples to violate the margin and penalizes the violations.
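Mercer's condition is easy to probe numerically: build the Gram matrix (44) for a sample of points and inspect its eigenvalues. The helper names, the RBF bandwidth, and the sample below are ours, chosen only for illustration.

    import numpy as np

    def gram(X, kernel):
        # kernel (Gram) matrix K of eq. (44) for the rows of X
        m = X.shape[0]
        return np.array([[kernel(X[i], X[j]) for j in range(m)] for i in range(m)])

    rbf = lambda a, b, sigma=1.0: np.exp(-np.sum((a - b)**2) / (2 * sigma**2))

    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 3))
    K = gram(X, rbf)
    eigvals = np.linalg.eigvalsh(K)            # K is symmetric, so eigvalsh applies
    print(eigvals.min() >= -1e-10)             # True: K is PSD, as Mercer's theorem requires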
Table 1: Commonly used kernel functions. Besides those listed, the chi-squared kernel and the histogram intersection kernel are also widely used, e.g., for histogram features.

    kernel            κ(x_i, x_j)
    linear            x_i^T x_j
    polynomial        (β x_i^T x_j + θ)^n,  where n is the degree
    RBF (Gaussian)    exp(-||x_i - x_j||² / (2σ²)),  where σ is the bandwidth

5.1 Slack variables

Ideally we would penalize the number of misclassified examples while keeping the margin constraints for the correctly classified ones:

    min_{w,b}  (1/2) w^T w + C Σ_{i=1}^m I(y_i ≠ sign(w^T ϕ(x_i) + b))    (60)
    s.t.  y_i (w^T ϕ(x_i) + b) ≥ 1  for every i with y_i = sign(w^T ϕ(x_i) + b),

where I(·) is the indicator function and the hyperparameter C > 0 trades the margin off against the number of violations. Unfortunately, (60) involves the discontinuous, non-convex 0/1 penalty and is intractable to optimize directly. The standard fix replaces the 0/1 penalty with a continuous one measured by slack variables ξ_i:

    ξ_i = 0                             if y_i (w^T ϕ(x_i) + b) ≥ 1;
    ξ_i = 1 - y_i (w^T ϕ(x_i) + b)      otherwise.    (61)

Problem (soft-margin SVM).

    min_{w,b,ξ}  (1/2) w^T w + C Σ_{i=1}^m ξ_i    (62)
    s.t.  y_i (w^T ϕ(x_i) + b) ≥ 1 - ξ_i,  i = 1, 2, ..., m,
          ξ_i ≥ 0,  i = 1, 2, ..., m.

A large C punishes margin violations heavily and approaches the hard-margin SVM; a small C tolerates more violations in exchange for a larger margin. Note that (62) really does implement (61): if y_i (w^T ϕ(x_i) + b) ≥ 1, both constraints on ξ_i permit ξ_i = 0 and the minimization drives ξ_i to 0; otherwise the binding constraint is ξ_i ≥ 1 - y_i (w^T ϕ(x_i) + b), which the minimization makes tight, so ξ_i = 1 - y_i (w^T ϕ(x_i) + b).

Claim (the soft-margin primal is a QP). Problem (62) is a quadratic program with d̃ + 1 + m variables and 2m constraints. In the form (10),

    u := [w; b; ξ],  Q := [[I, 0, 0], [0, 0, 0], [0, 0, 0]],  t := [0; 0; C·1],    (63)
    c_i := -[y_i ϕ(x_i); y_i; e_i],  d_i := -1,  i = 1, 2, ..., m,    (64)
    c_{m+i} := -[0; 0; e_i],  d_{m+i} := 0,  i = 1, 2, ..., m,    (65)

where the first block of constraints encodes y_i (w^T ϕ(x_i) + b) ≥ 1 - ξ_i and the second block encodes ξ_i ≥ 0.

5.2 The soft-margin dual

Problem (soft-margin dual).

    min_α  (1/2) Σ_{i=1}^m Σ_{j=1}^m α_i α_j y_i y_j ϕ(x_i)^T ϕ(x_j) - Σ_{i=1}^m α_i    (66)
    s.t.  Σ_{i=1}^m α_i y_i = 0,
          0 ≤ α_i ≤ C,  i = 1, 2, ..., m.
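The kernels of Table 1 are one-liners in NumPy; the parameter defaults below are arbitrary illustrative choices, not recommendations.

    import numpy as np

    def linear_kernel(xi, xj):
        return xi @ xj                                           # Table 1, linear

    def poly_kernel(xi, xj, beta=1.0, theta=1.0, n=3):
        return (beta * (xi @ xj) + theta) ** n                   # Table 1, polynomial of degree n

    def rbf_kernel(xi, xj, sigma=1.0):
        return np.exp(-np.sum((xi - xj) ** 2) / (2 * sigma**2))  # Table 1, RBF (Gaussian)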
Derivation. Compared with the hard-margin dual (27), the only change is the upper bound α_i ≤ C. To see where it comes from, write the Lagrangian of (62), with multipliers α_i for the margin constraints and β_i for ξ_i ≥ 0:

    L(w, b, ξ, α, β) := (1/2) w^T w + C Σ_{i=1}^m ξ_i
                        + Σ_{i=1}^m α_i (1 - ξ_i - y_i (w^T ϕ(x_i) + b))
                        + Σ_{i=1}^m β_i (-ξ_i).    (67)

The primal is

    min_{w,b,ξ} max_{α,β}  L(w, b, ξ, α, β)    (68)
    s.t.  α_i ≥ 0,  β_i ≥ 0,  i = 1, 2, ..., m.    (69)

Slater's condition holds, so min and max may be exchanged. For fixed (α, β), minimizing over (w, b, ξ) gives

    ∂L/∂w = 0  ⟹  w = Σ_{i=1}^m α_i y_i ϕ(x_i),    (70)
    ∂L/∂b = 0  ⟹  Σ_{i=1}^m α_i y_i = 0,    (71)
    ∂L/∂ξ_i = 0  ⟹  α_i + β_i = C.    (72)

Since β_i = C - α_i must be non-negative and α_i ≥ 0, we obtain 0 ≤ α_i ≤ C, and β can be eliminated. Substituting (70)-(72) back into (68) yields the dual (66).

Claim (the soft-margin dual is a QP). Problem (66) is a quadratic program with m variables and 2m + 2 constraints. In the form (10),

    u := α,  Q := [y_i y_j ϕ(x_i)^T ϕ(x_j)]_{i,j},  t := -1,    (73)
    c_i := -e_i,  d_i := 0,  i = 1, 2, ..., m,    (74)
    c_{m+i} := e_i,  d_{m+i} := C,  i = 1, 2, ..., m,    (75)
    c_{2m+1} := [y_1, y_2, ..., y_m]^T,  d_{2m+1} := 0,    (76)
    c_{2m+2} := -[y_1, y_2, ..., y_m]^T,  d_{2m+2} := 0.    (77)

Constraints (74)-(75) encode the box 0 ≤ α_i ≤ C, and (76)-(77) encode the equality Σ_i α_i y_i = 0.

5.3 KKT conditions and support vectors

Claim (KKT conditions of the soft-margin SVM). With

    u := [w; b; ξ],    (78)
    g_i(u) := 1 - [y_i ϕ(x_i); y_i; e_i]^T u = 1 - ξ_i - y_i (w^T ϕ(x_i) + b),  i = 1, 2, ..., m,    (79)
    g_{m+i}(u) := -[0; 0; e_i]^T u = -ξ_i,  i = 1, 2, ..., m,    (80)

the KKT conditions are: primal feasibility, 1 - ξ_i - y_i (w^T ϕ(x_i) + b) ≤ 0 and -ξ_i ≤ 0; dual feasibility, α_i ≥ 0 and β_i ≥ 0; and complementary slackness, α_i (1 - ξ_i - y_i (w^T ϕ(x_i) + b)) = 0 and β_i ξ_i = 0.

Claim (types of support vectors). As before, the support vectors are the examples with α_i > 0, and only they contribute to the solution. By complementary slackness, α_i > 0 implies 1 - ξ_i - y_i (w^T ϕ(x_i) + b) = 0, i.e., y_i (w^T ϕ(x_i) + b) = 1 - ξ_i. Two cases arise (a small sketch that sorts examples into these cases is given after this section). If 0 < α_i < C, then β_i = C - α_i > 0, so β_i ξ_i = 0 forces ξ_i = 0, and the example lies exactly on the margin boundary. If α_i = C, then β_i = 0 and ξ_i may be positive: when 0 < ξ_i ≤ 1 the example lies inside the margin but is still classified correctly, and when ξ_i > 1 the example is misclassified.

Claim. The solution (w, b) is recovered from the support vectors exactly as in (35)-(36), with ϕ(x_i) in place of x_i, using a support vector with 0 < α_i < C (for which ξ_i = 0) to compute b.

5.4 Hinge loss

Claim. At the optimum of (62), the slack variables satisfy

    ξ_i = max(0, 1 - y_i (w^T ϕ(x_i) + b)).    (81)

Indeed, if y_i (w^T ϕ(x_i) + b) ≥ 1 the minimization drives ξ_i to 0, and otherwise ξ_i = 1 - y_i (w^T ϕ(x_i) + b) > 0; this is exactly (61) written as a single formula.
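The case analysis above maps directly onto the numerical values of α. The following sketch (our own helper, with an arbitrary tolerance) splits the training indices into the three groups it describes.

    import numpy as np

    def categorize_alphas(alpha, C, tol=1e-8):
        # Sort indices by the soft-margin KKT analysis:
        #   alpha_i = 0        -> not a support vector
        #   0 < alpha_i < C    -> support vector exactly on the margin boundary (xi_i = 0)
        #   alpha_i = C        -> support vector inside the margin or misclassified (xi_i >= 0)
        non_sv    = np.where(alpha <= tol)[0]
        on_margin = np.where((alpha > tol) & (alpha < C - tol))[0]
        bound_sv  = np.where(alpha >= C - tol)[0]
        return non_sv, on_margin, bound_sv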
Claim (hinge-loss form of the soft-margin SVM). Substituting (81) into (62), the soft-margin SVM is equivalent to the unconstrained problem

    min_{w,b}  Σ_{i=1}^m max(0, 1 - y_i (w^T ϕ(x_i) + b)) + λ ||w||²,    (82)

a "loss + regularizer" objective: the first term measures how well the data are fit, the second controls the complexity of the model, and λ balances the two. Since ξ_i = max(0, 1 - y_i (w^T ϕ(x_i) + b)) ≥ 0, choosing λ = 1/(2C) makes (82) a positive multiple of the objective of (62), so the two problems have the same solutions.

Definition (hinge loss). The hinge loss is

    l(s) = max(0, 1 - s).    (83)

Writing s := y_i (w^T ϕ(x_i) + b) for the margin of example i, the first term of (82) is the total hinge loss over the training set. Table 2 compares the hinge loss with other common surrogate losses for classification [19]. When s > 0 the example is classified correctly and the hinge loss is small, vanishing once s ≥ 1; when s < 0 the example is misclassified and the loss grows linearly, so confidently wrong predictions are penalized heavily.

6 Training algorithms

6.1 SMO

The dual (66) involves the m × m matrix Q := [y_i y_j ϕ(x_i)^T ϕ(x_j)], which alone requires O(m²) memory, and a general-purpose QP solver does not exploit the special structure of the problem. Sequential minimal optimization (SMO) [10] is a dedicated solver for the SVM dual.

The underlying idea is coordinate descent (Algorithm 1): repeatedly optimize over one variable while holding all others fixed. For the SVM dual this cannot be done one variable at a time, because the equality constraint Σ_i α_i y_i = 0 ties the variables together: changing a single α_i would immediately violate it. SMO therefore updates two variables α_i and α_j at a time, keeping the constraint satisfied.

Algorithm 1: Coordinate descent.
Input: an objective f.
Output: an (approximate) minimizer u.
1: while not converged do
2:   for i = 1 to n do
3:     u_i ← arg min_{u_i} f(u)   (all other coordinates held fixed)
4:   end for
5: end while
6: return u

Claim (SMO subproblem). With all variables except α_i and α_j fixed, the dual (66) restricted to (α_i, α_j) reads (dropping terms that do not involve α_i or α_j)

    min_{α_i, α_j}  (1/2) ( α_i² y_i² ϕ(x_i)^T ϕ(x_i) + α_j² y_j² ϕ(x_j)^T ϕ(x_j)
                            + 2 α_i α_j y_i y_j ϕ(x_i)^T ϕ(x_j) ) - (α_i + α_j)    (84)
    s.t.  α_i y_i + α_j y_j = c,
          0 ≤ α_i ≤ C,  0 ≤ α_j ≤ C,

where c := -Σ_{k ≠ i,j} α_k y_k is fixed so that the equality constraint of (66) continues to hold.

Using the equality constraint to eliminate one variable, α_j = y_j (c - α_i y_i), turns (84) into a one-dimensional quadratic in α_i over an interval: minimize the unconstrained quadratic in closed form, clip the result to the feasible interval [L, H] implied by 0 ≤ α_i ≤ C and 0 ≤ α_j ≤ C, and then recover α_j. Each SMO step is therefore very cheap. SMO repeatedly picks a variable that violates the KKT conditions, pairs it with a second variable chosen heuristically to allow a large step, updates the pair, and stops when the KKT conditions hold for every variable up to a tolerance.
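The following sketch implements one two-variable update of the kind just described: the standard closed-form minimizer of (84) clipped to [L, H]. It assumes a precomputed kernel matrix K, the current bias b, and the box constant C; the pair-selection heuristics and the update of b are omitted, and the function name is ours.

    import numpy as np

    def smo_pair_update(i, j, alpha, y, K, b, C):
        # One SMO step on the pair (alpha_i, alpha_j), keeping sum_k alpha_k y_k fixed.
        f = K @ (alpha * y) + b                         # current decision values on the training set
        E_i, E_j = f[i] - y[i], f[j] - y[j]             # prediction errors
        eta = K[i, i] + K[j, j] - 2 * K[i, j]           # curvature of the 1-D quadratic
        if eta <= 0:
            return alpha                                # skip degenerate pairs in this sketch
        # feasible interval [L, H] for alpha_j from 0 <= alpha <= C and the equality constraint
        if y[i] != y[j]:
            L, H = max(0.0, alpha[j] - alpha[i]), min(C, C + alpha[j] - alpha[i])
        else:
            L, H = max(0.0, alpha[i] + alpha[j] - C), min(C, alpha[i] + alpha[j])
        a_j = np.clip(alpha[j] + y[j] * (E_i - E_j) / eta, L, H)   # closed form, then clip
        a_i = alpha[i] + y[i] * y[j] * (alpha[j] - a_j)            # keep alpha_i y_i + alpha_j y_j fixed
        new = alpha.copy()
        new[i], new[j] = a_i, a_j
        return new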
6.2 Pegasos

Instead of solving the dual, one can attack the primal hinge-loss form (82) directly. Pegasos [14] runs (sub)gradient descent on

    min_{w,b}  (1/m) Σ_{i=1}^m max(0, 1 - y_i (w^T x_i + b)) + λ ||w||²,    (85)

which is attractive whenever an explicit feature representation is available, e.g., for a linear SVM. Algorithm 2 gives the full-batch version; a NumPy sketch of it appears at the end of these notes.

Table 2: Comparison of common classification losses, written as functions of the margin s := y_i (w^T ϕ(x_i) + b).

    loss               formula               properties and typical use
    0/1 loss           I(s < 0)              what we ultimately care about, but non-convex and
                                             discontinuous; direct minimization is NP-hard
    hinge loss         max(0, 1 - s)         convex upper bound of the 0/1 loss; used by the SVM
    logistic loss      log(1 + exp(-s))      smooth convex surrogate of the 0/1 loss; used by
                                             logistic regression
    exponential loss   exp(-s)               convex upper bound of the 0/1 loss; used by AdaBoost

Algorithm 2: Pegasos.
Input: training set {(x_1, y_1), (x_2, y_2), ..., (x_m, y_m)}.
Output: (w, b).
1: while not converged do
2:   ∂J/∂w ← -(1/m) Σ_{i=1}^m I(y_i (w^T x_i + b) < 1) y_i x_i + 2λw
3:   ∂J/∂b ← -(1/m) Σ_{i=1}^m I(y_i (w^T x_i + b) < 1) y_i
4:   w ← w - η ∂J/∂w
5:   b ← b - η ∂J/∂b
6: end while
7: return (w, b)

Here η is the step size. In the stochastic variant of [14], each step uses a single randomly sampled example (or a small mini-batch) instead of the full sum, and the step size is decreased over the iterations.

6.3 Other approaches

Kernel methods need the m × m kernel matrix K := [κ(x_i, x_j)], whose storage alone is Ω(m²) and becomes prohibitive for large training sets. Several lines of work address scalability: core vector machines (CVM) [15] scale SVM training to very large data sets; the Nyström method [18] replaces K with a low-rank approximation; random features [12] approximate the kernel with an explicit low-dimensional feature map, after which a linear model can be trained. In practice, LibLinear [7] is a standard solver for linear SVMs and LibSVM [3] for kernel SVMs.

7 Extensions

7.1 Probabilistic outputs (ProbSVM)

The SVM decision value w^T ϕ(x) + b is not a probability. ProbSVM [11] converts it into one in two stages: first train an SVM to obtain (w, b); then compute the decision value s_i := w^T ϕ(x_i) + b of every training example and fit a logistic regression with parameters (θ_1, θ_0) on the pairs {(s_1, y_1), (s_2, y_2), ..., (s_m, y_m)}. The final classifier outputs

    h(x) := σ( θ_1 (w^T ϕ(x) + b) + θ_0 ),    (86)

where σ is the sigmoid function: θ_1 rescales the SVM decision value and θ_0 shifts it, and the output can be read as an estimate of P(y = +1 | x). In particular, when θ_1 > 0 and θ_0 = 0, thresholding h(x) at 1/2 recovers exactly the original SVM decision rule.

7.2 Multi-class SVM

The SVM is inherently a binary classifier. To handle K classes, one approach [17] trains K classifiers {(w_1, b_1), (w_2, b_2), ..., (w_K, b_K)} jointly, requiring the score of the true class of each example to exceed the score of every other class by a margin of at least 1 and penalizing violations with a hinge-style loss:

    min_{W,b}  Σ_{i=1}^m Σ_{k ≠ y_i} max(0, 1 - (w_{y_i}^T ϕ(x_i) + b_{y_i}) + (w_k^T ϕ(x_i) + b_k))
               + λ Σ_{k=1}^K w_k^T w_k.    (87)

At test time the predicted class is the one with the largest decision value w_k^T ϕ(x) + b_k.

7.3 Support vector regression (SVR)

For regression we no longer require h(x_i) = y_i exactly. Support vector regression (SVR) [6] tolerates deviations of up to ε: writing s := y - (w^T ϕ(x) + b) for the residual, the ε-insensitive loss is

    l_ε(s) = 0          if |s| ≤ ε;
    l_ε(s) = |s| - ε    otherwise.    (88)

Problem (SVR).

    min_{w,b}  Σ_{i=1}^m max(0, |y_i - (w^T ϕ(x_i) + b)| - ε) + λ w^T w.    (89)

When the residual |y_i - (w^T ϕ(x_i) + b)| is at most ε, the argument of the max is non-positive and the example incurs no loss; when it exceeds ε, the loss is |y_i - (w^T ϕ(x_i) + b)| - ε, growing linearly with the residual.
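As a companion to Algorithm 2 (Section 6.2), here is a minimal full-batch NumPy sketch of Pegasos-style sub-gradient descent on (85). The function name, the hyperparameter defaults, and the fixed step size are our own illustrative choices; the stochastic variant of [14] samples one example per step and decays the step size.

    import numpy as np

    def pegasos_train(X, y, lam=0.01, eta=0.1, n_iters=1000):
        # Full-batch sub-gradient descent on eq. (85), following Algorithm 2.
        m, d = X.shape
        w, b = np.zeros(d), 0.0
        for _ in range(n_iters):
            viol = y * (X @ w + b) < 1                              # examples with non-zero hinge loss
            grad_w = -(y[viol, None] * X[viol]).sum(axis=0) / m + 2 * lam * w
            grad_b = -y[viol].sum() / m
            w -= eta * grad_w
            b -= eta * grad_b
        return w, b

    # usage on the toy data from Section 1 (made-up values)
    X = np.array([[2.0, 2.0], [1.5, 3.0], [-1.0, -2.0], [-2.0, -0.5]])
    y = np.array([+1, +1, -1, -1])
    w, b = pegasos_train(X, y)
    print(np.sign(X @ w + b))   # should reproduce y once training has converged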
References

[1] B. E. Boser, I. M. Guyon, and V. N. Vapnik. A training algorithm for optimal margin classifiers. In Proceedings of the Annual Workshop on Computational Learning Theory, pages 144-152, 1992.
[2] S. Boyd and L. Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
[3] C.-C. Chang and C.-J. Lin. LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology, 2(3):27, 2011.
[4] C. Cortes and V. Vapnik. Support-vector networks. Machine Learning, 20(3):273-297, 1995.
[5] N. Cristianini and J. Shawe-Taylor. An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge University Press, 2000.
[6] H. Drucker, C. J. Burges, L. Kaufman, A. J. Smola, and V. Vapnik. Support vector regression machines. In Advances in Neural Information Processing Systems, pages 155-161, 1997.
[7] R.-E. Fan, K.-W. Chang, C.-J. Hsieh, X.-R. Wang, and C.-J. Lin. LIBLINEAR: A library for large linear classification. Journal of Machine Learning Research, 9(8):1871-1874, 2008.
[8] T. Hofmann, B. Schölkopf, and A. J. Smola. Kernel methods in machine learning. The Annals of Statistics, pages 1171-1220, 2008.
[9] G. R. Lanckriet, N. Cristianini, P. Bartlett, L. E. Ghaoui, and M. I. Jordan. Learning the kernel matrix with semidefinite programming. Journal of Machine Learning Research, 5(1):27-72, 2004.
[10] J. Platt. Sequential minimal optimization: A fast algorithm for training support vector machines. Microsoft Research, 1998.
[11] J. Platt et al. Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods. Advances in Large Margin Classifiers, 10(3):61-74, 1999.
[12] A. Rahimi and B. Recht. Random features for large-scale kernel machines. In Advances in Neural Information Processing Systems, pages 1177-1184, 2008.
[13] B. Schölkopf and A. J. Smola. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, 2002.
[14] S. Shalev-Shwartz, Y. Singer, N. Srebro, and A. Cotter. Pegasos: Primal estimated sub-gradient solver for SVM. Mathematical Programming, 127(1):3-30, 2011.
[15] I. W. Tsang, J. T. Kwok, and P.-M. Cheung. Core vector machines: Fast SVM training on very large data sets. Journal of Machine Learning Research, 6(4):363-392, 2005.
[16] V. Vapnik. The Nature of Statistical Learning Theory. Springer Science & Business Media, 2013.
[17] J. Weston, C. Watkins, et al. Support vector machines for multi-class pattern recognition. In Proceedings of the European Symposium on Artificial Neural Networks, volume 99, pages 219-224, 1999.
[18] C. K. Williams and M. Seeger. Using the Nyström method to speed up kernel machines. In Advances in Neural Information Processing Systems, pages 682-688, 2001.
[19] [Chinese-language machine learning textbook, 2016; full citation details were lost in extraction.]