National Sun Yat-sen University Electronic Theses and Dissertations Archive


Air Visibility Forecasting via Artificial Neural Networks and Feature Selection Techniques

July 2003

Abstract

This thesis forecasts air visibility with artificial neural networks combined with feature selection. The Sequential Floating Search Method (SFSM) selects the input features, and Radial Basis Function (RBF) networks are trained on seven groupings of the data (all days, the 10~4 and 5~9 month periods, and the four seasons). The trained models are also used to forecast the visibility classes of year 92 (2003).

Contents

1  Introduction
2  Data Analysis and Feature Selection Methods
   2.1  Data description (2.1.1 Emission sources; 2.1.2 Meteorology; 2.1.3 Visibility)
   2.2  Time-series properties (2.2.1 Trend; 2.2.2 ARMA Model)
   2.3  Feature selection (2.3.1 Sequential Backward Selection; 2.3.2 Sequential Forward Selection; 2.3.3 Sequential Floating Search Method)
3  The RBF Network (3.1 Network model; 3.2 Orthogonal least squares; 3.3 Network size; 3.4 Early Stop)
4  Experiment Design (4.1 Data preparation; 4.2 Input and output; 4.3 Seasonal grouping; 4.4 Yearly distribution; 4.5 Experiment summary)
5  Results (5.1 Overall classification; 5.2 SFSM feature selection; 5.3 RBF classification; 5.4 MLP comparison; 5.5 Year-92 forecast; 5.6 610-day sliding experiment)
6  Conclusions and Future Work
References

List of Figures

2.1~2.3  Share of observations with visibility 0~2 km, 2~8 km and >8 km
2.4  Daily series, July of year 88
2.5  Monthly mean PM10, years 84~88
2.6~2.8  Trend plots
2.9~2.13  Monthly means of PM10, SO2, NO2, O3 and CO, years 84~88
2.14~2.17  ARMA(4,4) fits, years 88~91
2.18  ARMA(5,3) fit, year 91
2.19  Sequential Backward Selection flowchart
2.20  Sequential Forward Selection flowchart
2.21, 2.22  Sequential Floating Search Method flowcharts
3.1  RBF network structure
3.2~3.4  Network construction and width assignment
3.5  Early stop
3.6  RBF training flow
4.1, 4.2  Data preparation
4.3  Experiment flow
5.1  Synthetic two-class data (features x2, x3)
5.2  SFSM selection trace on the synthetic data
5.3  SFSM feature selection with SFS/SBS steps and the RBF criterion
5.4  RBF classification procedure
5.5  Per-class results, years 83~92

List of Tables

2.1  Monthly meteorological statistics
4.1  Feature indices
4.2  Ranges of the 13 variables
4.3, 4.4  Days and percentages per visibility class and grouping, years 83~92
4.5  Days per visibility class by year
4.6  Class distribution of the three training periods
5.1, 5.2  Overall classification accuracy
5.3~5.6  SFS versus SFSM selections and RBF accuracy
5.7~5.9  SFSM selections for the three training periods
5.10  Selection counts of the 13 features
5.11~5.13  RBF accuracy with SFSM features for the three periods
5.14, 5.15  Combined-model comparison, 83~92 and 88~92
5.16  Class percentages (as Table 4.4)
5.17  MLP accuracy
5.18, 5.19  Year-92 forecasts trained on 83~91 and 88~91
5.20, 5.21  Year-92 combined-model comparison
5.22  610-day sliding experiment

Chapter 1  Introduction

1.1  Neural networks for forecasting

Neural networks (NN) are widely applied to pattern recognition and function approximation; in both cases the network learns a mapping from inputs to outputs. This thesis adopts the Radial Basis Function neural network (RBF) as its forecasting model.

1.2  Motivation

Air-quality statistics for year 89 report days with a Pollutant Standards Index (PSI) above 100 at 11.74 %, 10.20 % and 4.05 %. Suspended particulates (PM10) and ozone (O3) are the main contributors to light extinction. Over 1999~2000, visibility below 8 km occurred on 71.27 % of days and visibility in the 2~8 km range on 58.64 %; visibility is therefore treated in the three classes 0~2 km, 2~8 km and above 8 km.

Earlier work modelled visibility with multiple regression models. This thesis instead proceeds in three steps: (1) select informative input features, (2) classify visibility with an RBF network, and (3) evaluate the trained models as forecasts.

1.3  Organization

Chapter 2 analyzes the data and introduces the feature selection methods; Chapter 3 describes the RBF network; Chapter 4 the experiment design; Chapter 5 the results; and Chapter 6 draws conclusions and outlines future work.

Chapter 2  Data Analysis and Feature Selection Methods

This chapter first describes the air-quality and meteorological data of the study area (Section 2.1, drawing on a 1999 investigation that contrasts months 6~12 with months 3~5), then examines the time-series behaviour of the data (Section 2.2), and finally presents the feature selection methods (Section 2.3).

2.1  Data description

2.1.1  Emission sources

The major pollutant emissions of the study area are:
1. Suspended particulates (PM10): 19,145 tonnes per year; the largest source categories contribute about 30 %, 16 %, 11 % and 7~9 %.
2. Sulfur oxides (SOx): 41,480 tonnes per year; shares of roughly 95 %, 37 %, 23 %, 13 % and 1~2 % are attributed to the main source categories.
3. Nitrogen oxides (NOx): 52,902 tonnes per year; major shares of about 71 %, 24 %, 22 %, 12 % and 11 %.
4. Non-methane hydrocarbons (NMHC): 60,146 tonnes per year; major shares of about 31 %, 21 %, 14 % and 13 %.
5. Carbon monoxide (CO): 372,425 tonnes per year; about 58 % from the largest category, 31 % from the next, and 5 % from smaller ones.

2.1.2  Meteorology

The meteorological records span 79/1/1~88/12/31 (1990~1999); Table 2.1 summarizes the monthly statistics.
1. Annual rainfall averages 1873.1 mm and is concentrated in months 6~9 (monthly totals 208.6~446.4 mm); months 10~12 receive only 8.3~45.4 mm.
2. The annual mean temperature is 25.1 °C; the warmest month is July (29.1 °C) and the coolest January (19.4 °C).
3. Monthly mean relative humidity lies between 72 % and 80 %.

Table 2.1  Monthly meteorological statistics of the study area (1990~1999)

Month | Temp (°C) | RH (%) | Wind (m/s) | Dir. | Sunshine (hr) | Radiation (MJ/m²) | Evaporation (mm) | Pressure (hPa) | Rainfall (mm)
1     | 19.4 | 72.3 | 2.33 | N   | 160.3 | 283.3 | 95.4  | 1017.9 | 18.3
2     | 20.1 | 72.3 | 2.24 | N   | 153.1 | 330.5 | 99.3  | 1017.4 | 30.1
3     | 22.9 | 73.8 | 2.22 | N   | 156.7 | 367.2 | 131.2 | 1015.2 | 51.8
4     | 25.3 | 75.8 | 2.11 | WNW | 180.4 | 429.5 | 138.7 | 1012.5 | 111.1
5     | 27.3 | 76.9 | 2.15 | WNW | 194.0 | 450.8 | 160.6 | 1009.7 | 142.7
6     | 28.7 | 79.6 | 2.35 | SSE | 196.8 | 452.3 | 156.9 | 1007.7 | 446.4
7     | 29.1 | 78.3 | 2.42 | SSE | 209.1 | 466.4 | 173.9 | 1006.7 | 391.6
8     | 28.7 | 80.1 | 2.46 | SSE | 183.8 | 409.7 | 157.7 | 1006.1 | 407.7
9     | 28.1 | 78.1 | 2.18 | W   | 165.4 | 402.3 | 139.9 | 1008.1 | 208.6
10    | 26.4 | 75.6 | 1.91 | WNW | 153.0 | 339.8 | 130.5 | 1012.5 | 45.4
11    | 24.1 | 73.0 | 1.87 | N   | 146.2 | 278.9 | 112.6 | 1015.5 | 8.3
12    | 21.0 | 72.8 | 2.03 | N   | 136.7 | 254.5 | 95.4  | 1018.4 | 11.1
Mean  | 25.1 | 75.7 | 2.19 | N   | 169.6 | 372.1 | 132.7 | 1012.3 | 156.1
Total | -    | -    | -    | -   | 2035.5 | -    | 1592.1 | -     | 1873.1

The annual mean relative humidity is 75.7 %, highest in August (80.1 %) and lowest in January and February (72.3 %).
4. Sunshine totals 2035.5 hours per year (maximum 209.1 in July, minimum 136.7 in December); global radiation totals 4465.2 MJ/m² per year (maximum 466.4 MJ/m² in July, minimum 283.3 MJ/m² in January).
5. The annual mean wind speed is 2.19 m/s, with monthly means of 1.9~2.5 m/s (maximum 2.46 m/s in August, minimum 1.87 m/s in November).
6. Annual evaporation is 1592.1 mm (maximum 173.9 mm in July, minimum 95.4 mm in January and December).
7. The annual mean pressure is 1012.3 hPa (maximum 1018.4 hPa in December, minimum 1006.1 hPa in August).
8. Cloud amount is recorded on a 0~10 scale (0~1.39 clear, 1.4~5.99 partly cloudy, 6 and above cloudy); monthly means range from about 5.5 to 7.0, with an annual mean near 6.4.
9. The Pasquill stability classes D, E and F dominate (57.2 %~60.5 % of observations).
10. The mixing height is about 800~1100 m around noon (12~14 h) and 380~580 m in the early morning (4~6 h).

2.1.3  Visibility

Visibility is recorded in three classes — 0~2 km, 2~8 km and above 8 km — using the 11:00 and 14:00 observations.

The shares of the three classes in the 88/89 observations (11:00 and 14:00) are plotted in Figures 2.1~2.3.

[Figure 2.1  Share of observations with visibility 0~2 km]
[Figure 2.2  Share of observations with visibility 2~8 km]
[Figure 2.3  Share of observations with visibility >8 km]

2.2  Time-series properties of the data

Before any model is fitted, the behaviour of the series over time is examined.

2.2.1  Trend

1. Daily behaviour: Figure 2.4 plots a day-by-day series for July of year 88.
2. Seasonal behaviour: Figure 2.5 shows the monthly mean PM10 for years 84~88; concentrations peak in winter (months 12~2) and bottom out in summer (months 6~8).
3. Long-term trend: Figure 2.6.
4. Figures 2.7 and 2.8.
5. Figures 2.9~2.13 give the monthly means of the five pollutants PM10, SO2, NO2, O3 and CO over years 84~88.

[Figure 2.4  Daily series, July of year 88]
[Figure 2.5  Monthly mean PM10 (µg/m³), years 84~88]
[Figure 2.6  Trend plot]
[Figure 2.7  Trend plot]
[Figure 2.8  Trend plot (20 …)]
[Figure 2.9  Monthly mean PM10 (µg/m³), years 84~88]
[Figure 2.10  Monthly mean SO2 (ppb), years 84~88]
[Figure 2.11  Monthly mean NO2 (ppb), years 84~88]
[Figure 2.12  Monthly mean O3 (ppb), years 84~88]

[Figure 2.13  Monthly mean CO (ppm), years 84~88]

2.2.2  ARMA Model Analysis

The autoregressive and moving average (ARMA) model analysis introduced by Box and Jenkins in 1968 combines an autoregressive (AR) model of order p with a moving average (MA) model of order q. Applied to the 88~92 series, the ARMA(p, q) model is

    X_t = \alpha_1 X_{t-1} + \alpha_2 X_{t-2} + \cdots + \alpha_p X_{t-p} + \beta_1 Z_{t-1} + \beta_2 Z_{t-2} + \cdots + \beta_q Z_{t-q}    (2-1)

where X_t is the value of the series at time t, Z_t is white noise, and the coefficients \alpha_i, \beta_j and the orders p and q are to be estimated. The models were fitted with the MATLAB Statistics Toolbox. For each of the years 88~91, searching over p and q gave ARMA(4,4) as the best fit; Figures 2.14~2.17 show the fitted models.

[Figure 2.14  Year 88 series and its ARMA(4,4) model]
[Figure 2.15  Year 89 series and its ARMA(4,4) model]
[Figure 2.16  Year 90 series and its ARMA(4,4) model]
[Figure 2.17  Year 91 series and its ARMA(4,4) model]

The year-91 fit in Figure 2.17 is the weakest: unlike years 88~90, year 91 is fitted better by an ARMA(5,3) model (Figure 2.18), indicating that the best orders (p, q) are not stable from year to year.

[Figure 2.18  Year 91 series and its ARMA(5,3) model]
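The order search just described can be reproduced compactly. The sketch below is a minimal Python version (the thesis used the MATLAB Statistics Toolbox); the synthetic series, the candidate order grid and the AIC ranking are assumptions of this illustration, not details taken from the thesis:

```python
# Minimal ARMA(p, q) order search over a 1-D numpy array `series`.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def best_arma(series, max_p=5, max_q=5):
    """Fit ARMA(p, q) for each candidate order; keep the lowest-AIC fit."""
    best = None
    for p in range(1, max_p + 1):
        for q in range(1, max_q + 1):
            try:
                fit = ARIMA(series, order=(p, 0, q)).fit()  # d = 0 -> pure ARMA
            except Exception:
                continue  # some orders fail to converge; skip them
            if best is None or fit.aic < best[0]:
                best = (fit.aic, (p, q), fit)
    return best

# Synthetic stand-in for one year of daily values
rng = np.random.default_rng(0)
series = rng.normal(size=365).cumsum() * 0.1 + 8.0
aic, order, fit = best_arma(series, max_p=4, max_q=4)
print(order, aic)
```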

2.3  Feature Selection Methods

Feature selection picks, from all candidate inputs, the subset that best supports the learning task; it reduces the input dimension, the amount of computation, and the risk of fitting noise. With n candidate features there are 2^n − 1 non-empty subsets, so exhaustive evaluation quickly becomes infeasible and sequential search strategies are used instead. Three are described below.

2.3.1  Sequential Backward Selection

Sequential Backward Selection (SBS) searches top-down: it starts from the full feature set and discards one feature at a time (flowchart in Figure 2.19). A criterion function (CF) scores each candidate subset; here the CF is the accuracy of a network trained on that subset. For example, to reduce the four features {x1, x2, x3, x4} to two: first the CF of the full vector [x1, x2, x3, x4]^T is computed; then the values C(x1,x2,x3), C(x1,x2,x4), C(x1,x3,x4) and C(x2,x3,x4) are compared and the best three-feature subset, say {x1, x2, x3}, is kept; finally C(x1,x2), C(x1,x3) and C(x2,x3) are compared, leaving e.g. {x1, x2}. Reducing n features to m in this way takes 1 + [(n+1)n − m(m+1)]/2 CF evaluations, far fewer than the 2^n − 1 of an exhaustive search.

[Figure 2.19  Flowchart of Sequential Backward Selection: starting from {F1~Fn}, repeatedly remove the feature X* whose removal maximizes C({F1~Fn} − X*) until m features remain]

2.3.2  Sequential Forward Selection

Sequential Forward Selection (SFS) is the bottom-up counterpart of SBS: it starts from the empty set and adds one feature at a time (flowchart in Figure 2.20). To select three of the four features {x1, x2, x3, x4}: the CF of each single feature is computed and the best, say x1, is kept; then C(x1,x2), C(x1,x3) and C(x1,x4) are compared, giving e.g. {x1, x3}; finally {x1, x3, x2} and {x1, x3, x4} are compared, giving e.g. {x1, x3, x4}. Selecting m of n features takes nm − m(m−1)/2 CF evaluations. A sketch of both searches follows Figure 2.20.

SBS and SFS share a weakness known as the nesting effect: a feature SFS has added can never be removed, and a feature SBS has removed can never come back, so an early mistake propagates into the final subset.

[Figure 2.20  Flowchart of Sequential Forward Selection: starting from the empty set, repeatedly add the feature Fk that maximizes the CF of {F1 ~ F(k−1)} + Fk until m features are selected]
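The two greedy searches differ only in direction, so a single sketch covers both. This is a minimal Python rendering under one assumption: a caller-supplied criterion function cf(subset), e.g. the validation accuracy of a network trained on those feature columns:

```python
# Greedy sequential feature selection. `cf(features)` scores a tuple of
# feature indices (higher is better); it is assumed, not defined here.

def sfs(n_features, m, cf):
    """Sequential Forward Selection: grow from the empty set to m features."""
    selected = []
    while len(selected) < m:
        rest = [f for f in range(n_features) if f not in selected]
        best = max(rest, key=lambda f: cf(tuple(selected + [f])))
        selected.append(best)          # added features are never removed
    return selected

def sbs(n_features, m, cf):
    """Sequential Backward Selection: shrink from the full set to m features."""
    selected = list(range(n_features))
    while len(selected) > m:
        worst = max(selected,          # removing `worst` hurts the CF least
                    key=lambda f: cf(tuple(x for x in selected if x != f)))
        selected.remove(worst)         # removed features never return
    return selected
```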

2.3.3  Sequential Floating Search Method

The Sequential Floating Search Method (SFSM; Pudil et al., 1994) combines SFS and SBS so that selected features can "float" in and out of the subset, which avoids the nesting effect. Figure 2.21 shows the overall flow — forward selection, stop criterion, a test, and conditional backward selection — and Figure 2.22 the detailed flowchart. Let X_k = {x_1, …, x_k} be the k features currently selected out of n, and Y_{n−k} the remaining features.

Inclusion:
   x_{k+1} = argmax_{y ∈ Y_{n−k}} C({X_k, y}); set X_{k+1} = {X_k + x_{k+1}} and record its CF.

Test:
1. Find the least significant feature of X_{k+1}: x_r = argmax_{x ∈ X_{k+1}} C(X_{k+1} − {x}).
2. If r = k+1 (the feature just added is the least significant), set k = k+1 and return to Inclusion; this is a plain SFS step.
3. If r ≠ k+1 but C(X_{k+1} − {x_r}) ≤ C(X_k), keep X_{k+1} anyway: set k = k+1 and return to Inclusion.
4. If k = 2, set X_k = X_{k+1} − {x_r} with C(X_k) = C(X_{k+1} − {x_r}) and return to Inclusion (no deeper backtracking from two features).
5. Otherwise enter the Exclusion phase.

Exclusion:
1. X'_k = X_{k+1} − {x_r}.
2. Find the least significant feature of X'_k: x_s = argmax_{y ∈ X'_k} C(X'_k − {y}), as in an SBS step.
3. If C(X'_k − {x_s}) ≤ C(X_{k−1}), set X_k = X'_k with C(X_k) = C(X'_k) and return to Inclusion.
4. Otherwise remove it: X'_{k−1} = X'_k − {x_s}, k = k − 1.
5. If k = 2, set X_k = X'_k with C(X_k) = C(X'_k) and return to Inclusion.
6. Otherwise repeat from Exclusion step 2.

Because a feature removed in the Exclusion phase may be re-selected later (and vice versa), SFSM does not suffer from the nesting effect.

[Figure 2.21  Overall flow of the Sequential Floating Search Method]
[Figure 2.22  Detailed flowchart of the Sequential Floating Search Method]
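A compact rendering of the floating search, under the same assumed cf(subset) criterion as before; this is a sketch of the inclusion / conditional-exclusion loop, not a line-by-line transcription of Figure 2.22:

```python
# Sequential floating forward selection (SFSM) sketch. Features float:
# each forward step is followed by conditional backward steps that may
# undo earlier picks whenever that improves on the best subset of the
# same size seen so far.

def sfsm(n_features, m, cf):
    selected = []
    best_score = {}                        # best CF seen per subset size
    while len(selected) < m:
        # Inclusion: add the feature that maximizes the CF
        rest = [f for f in range(n_features) if f not in selected]
        f_add = max(rest, key=lambda f: cf(tuple(selected + [f])))
        selected = selected + [f_add]
        best_score[len(selected)] = max(best_score.get(len(selected), -1.0),
                                        cf(tuple(selected)))
        # Conditional exclusion: drop the least significant feature while
        # doing so beats the best smaller subset found so far
        while len(selected) > 2:
            f_rm = max(selected,
                       key=lambda f: cf(tuple(x for x in selected if x != f)))
            reduced = tuple(x for x in selected if x != f_rm)
            if f_rm != f_add and cf(reduced) > best_score.get(len(reduced), -1.0):
                selected = list(reduced)
                best_score[len(selected)] = cf(reduced)
            else:
                break
    return selected
```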

Chapter 3  The RBF Network

The Radial Basis Function (RBF) network has an input layer, one hidden layer and an output layer, with a nonlinear activation function only in the hidden layer. Its hidden-unit centers are selected here with the orthogonal least squares algorithm (OLS) of Chen and Billings (1992).

3.1  Network model

Figure 3.1 shows the structure: as in a perceptron, the layers are fully connected. Hidden unit j computes the Gaussian basis function

    \phi_j(x) = \exp\!\left( -\frac{\lVert x - c_j \rVert^2}{\rho^2} \right)    (3-1)

where c_j is the unit's center, \rho its width, and \lVert\cdot\rVert the Euclidean norm. The network realizes a mapping f: R^n → R^m whose i-th output for input x is

    \hat{y}_i = \sum_{j=1}^{J} \phi_j(\lVert x - c_j \rVert, \rho)\, \theta_{ji},   i = 1, …, m    (3-2)

with J hidden units and weights \theta_{ji} from hidden unit j to output i.
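Equations (3-1) and (3-2) translate directly into a few lines of numpy. In this sketch the centers, width and weights are assumed given (the thesis obtains them with OLS, Section 3.2); the toy values exist only to make the example runnable:

```python
# Forward pass of an RBF network per (3-1)-(3-2).
import numpy as np

def rbf_forward(X, centers, rho, theta):
    """X: (N, n) inputs -> (N, m) outputs y_hat = Phi @ theta."""
    # squared Euclidean distance between every input and every center
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)  # (N, J)
    phi = np.exp(-d2 / rho**2)                                     # (3-1)
    return phi @ theta                                             # (3-2)

# toy usage: 2-D inputs, 3 hidden units, 1 output
X = np.array([[0.2, 0.1], [0.8, 0.9]])
centers = np.array([[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]])
theta = np.array([[0.3], [0.5], [0.2]])
print(rbf_forward(X, centers, rho=0.7, theta=theta))
```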

[Figure 3.1  Structure of the RBF network]

By (3-2), training the RBF network amounts to fitting the mapping f: x → y.

3.2  Center selection by orthogonal least squares

Chen and Billings (1992) select the centers with the orthogonal least squares (OLS) algorithm. For training sample t, the network output is written as the regression

    y_i(t) = \sum_{j=1}^{J} \phi_j(t)\, \theta_{ji} + e_i(t),   1 \le i \le m    (3-3)

where \phi_j(t) is the response of hidden unit j to sample t and e_i(t) the residual. In matrix form,

    Y = F\Theta + E    (3-4)

and the objective function is

    E(c, \theta) = \frac{1}{2} \sum_{k=1}^{N} \sum_{i=1}^{m} \left( y_{ki} - \hat{y}_{ki} \right)^2    (3-5)

with N training samples and m outputs. OLS ranks candidate centers by their error reduction ratio (ERR) and solves the weights by least squares (LS). The regressor matrix F of (3-3) is factored as

    F = WB    (3-6)

where W = [w_1 … w_J] has mutually orthogonal columns (w_i^T w_j = 0 for i ≠ j) and B is unit upper triangular:

    B = \begin{bmatrix} 1 & \beta_{12} & \cdots & \beta_{1J} \\ 0 & 1 & & \vdots \\ & & \ddots & \beta_{J-1,J} \\ 0 & & 0 & 1 \end{bmatrix}    (3-7)

W and B are computed by Gram-Schmidt orthogonalization (Björck, 1967):

    w_1 = f_1    (3-8)

    \beta_{ik} = \frac{w_i^T f_k}{w_i^T w_i}, \; 1 \le i < k; \qquad w_k = f_k - \sum_{i=1}^{k-1} \beta_{ik} w_i, \; k = 2, …, J    (3-9)

Substituting (3-6) into (3-4) gives

    Y = WG + E    (3-10)
    B\Theta = G    (3-11)

The least squares estimate of G is

    \hat{G} = (W^T W)^{-1} W^T Y    (3-12)

or elementwise

    \hat{\gamma}_{ji} = \frac{w_j^T y_i}{w_j^T w_j},   1 \le j \le J, \; 1 \le i \le m    (3-13)

and the weights \hat{\Theta} follow by back-substitution from

    B\hat{\Theta} = \hat{G}    (3-14)

Because the columns of W are orthogonal, the output energy (covariance) decomposes as

    \frac{\mathrm{trace}(Y^T Y)}{N} = \frac{1}{N} \sum_{j=1}^{J} \sum_{i=1}^{m} \gamma_{ji}^2\, w_j^T w_j + \frac{\mathrm{trace}(E^T E)}{N}    (3-15)

so the part of the output energy explained by regressor w_j is

    \frac{1}{N} \sum_{i=1}^{m} \gamma_{ji}^2\, w_j^T w_j    (3-16)

and its error reduction ratio is

    [err]_j = \frac{\sum_{i=1}^{m} \gamma_{ji}^2\, w_j^T w_j}{\mathrm{trace}(Y^T Y)}    (3-17)

The OLS selection procedure is:
1. Set j = 1.
2. Set k = 1.
3. Take training point x_k as candidate center c_j and form its regressor.
4. Orthogonalize it against the regressors already selected and evaluate its error reduction ratio (equivalently, the drop in the error (3-5)).
5. Set k = k+1 and return to step 3 until every remaining candidate has been tried.
6. Select as the j-th center the candidate with the largest [err] and add it to the network.
7. Set j = j+1 and return to step 2 until the stopping condition is reached.

The width is set from the selected centers as

    \rho^2 = \frac{d_{max}^2}{j + 1}    (3-18)

where d_{max} is the largest distance among the j selected centers.
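A simplified single-output sketch of the OLS selection loop, assuming Gaussian regressors built from the training points themselves as candidate centers; the Gram-Schmidt step and the ranking follow (3-8)~(3-9) and (3-17), but this is an illustration, not the thesis's exact implementation:

```python
# OLS forward selection of RBF centers (single output), numpy only.
import numpy as np

def gaussian_design(X, centers, rho):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / rho**2)

def ols_select(X, y, rho, n_centers):
    F = gaussian_design(X, X, rho)     # (N, N): column k = candidate x_k
    y = y.reshape(-1)
    selected, W = [], []               # chosen indices, orthogonal regressors
    for _ in range(n_centers):
        best_err, best_k, best_w = -1.0, None, None
        for k in range(F.shape[1]):
            if k in selected:
                continue
            w = F[:, k].copy()
            for wi in W:               # Gram-Schmidt against chosen regressors
                w -= (wi @ F[:, k]) / (wi @ wi) * wi
            if (w @ w) < 1e-12:
                continue               # numerically dependent: skip
            g = (w @ y) / (w @ w)      # (3-13)
            err = g**2 * (w @ w) / (y @ y)   # (3-17), single output
            if err > best_err:
                best_err, best_k, best_w = err, k, w
        selected.append(best_k)
        W.append(best_w)
    return X[selected]                 # the chosen centers
```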

OLS thus ranks the candidate centers by their contribution to the output energy and obtains the weights within the same decomposition. The remaining question is the network size: adding too many hidden units makes the RBF network overfit.

3.3  Determining the network size

Figure 3.2 shows the construction flow: hidden units (with the Gaussian activation function) are added one at a time in decreasing order of the error reduction ratio (err) of Section 3.2, and after each addition the criterion

    MSE = \frac{1}{N} \sum_{i=1}^{N} \left( y_i - \hat{y}_i \right)^2    (3-19)

is evaluated, where y_i is the desired output and \hat{y}_i the network output. Figure 3.3 shows the resulting RBF structure. The width follows (3-18); alternatively, when the RBF network acts as a classifier, each center can be treated as a prototype and given its own width by the nearest-neighbor (nn) rule (Figure 3.4):

    \rho_{new} = \min \lVert center_{new} - center_{old} \rVert    (3-20)

that is, a newly added center takes as width the distance to its nearest existing center; this nearest-neighbor width replaces (3-18).

[Figure 3.2  Constructing the network under OLS]
[Figure 3.3  Structure of the constructed RBF network]
[Figure 3.4  Width assignment by the nearest-neighbor rule (centers as prototypes)]
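Both width rules fit in a few lines; (3-18) is implemented here as transcribed above, and (3-20) as the nearest-neighbor rule — a sketch, with `centers` assumed to be a (J, n) numpy array:

```python
# Width assignment per (3-18) and (3-20).
import numpy as np

def width_global(centers):
    """(3-18): rho^2 = d_max^2 / (J + 1), d_max = largest center spacing."""
    J = len(centers)
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2)
    return np.sqrt(d.max() ** 2 / (J + 1))

def width_nearest_neighbor(centers):
    """(3-20): each center's rho = distance to its nearest other center."""
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)        # ignore self-distances
    return d.min(axis=1)               # one width per center
```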

3.4  Early Stop

When the training data contain noise, growing the network by OLS until the training error is minimal leads to overfitting: the error on data not seen in training eventually rises while the training error keeps falling (Figure 3.5). Cross-validation counters this. The data are divided into a training set, a validation set and a test set; the network is trained on the training set while its error on the validation set is monitored, and training stops once the validation error turns upward — the early stop. To tolerate small fluctuations, training is stopped at step k only when the validation errors at steps k+1 and k+2 both exceed the error at step k. Figure 3.6 shows the complete RBF construction flow with the MSE-based early stop.

[Figure 3.5  Early stop: training versus validation error]
[Figure 3.6  Flow of RBF construction with early stop (MSE criterion)]
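The stop test can be stated directly in code. A sketch of the k / k+1 / k+2 rule described above, applied to a recorded validation-error curve (the function name and the sample curve are illustrative):

```python
# Early-stop rule: stop at step k once the validation error at k+1 and
# k+2 both exceed the error at k.
def early_stop_index(val_errors):
    """Return the step k at which training should have stopped."""
    for k in range(len(val_errors) - 2):
        if val_errors[k + 1] > val_errors[k] and val_errors[k + 2] > val_errors[k]:
            return k
    return len(val_errors) - 1  # never triggered: use the final step

print(early_stop_index([0.9, 0.7, 0.6, 0.62, 0.65, 0.7]))  # -> 2
```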

Chapter 4  Experiment Design

The experiments use data from year 83 to 92/4, with a gap in year 87: the periods 83/01/01~86/12/31 and 87/11/25~92/4/24 together contain 3074 days, of which 3023 remain after the treatment of Section 4.1. Thirteen candidate input features are used — meteorological variables plus the three pollutant concentrations PM10, O3 and NO2 — indexed as in Table 4.1.

Table 4.1  Feature indices
Index 1~10: meteorological variables (units as in Table 4.2); Index 11~13: PM10, O3, NO2

4.1  Data preparation

4.1.1  Missing values

Two rules fill the gaps in the 83~92/4 records: (1) missing PM10, O3 and NO2 readings are filled in from neighbouring values; (2) longer gaps (beyond about 10 records) are discarded. Figures 4.1 and 4.2 (quoting the values 13.145, 11.534, 12.264 and 9.573) illustrate the treatment.

[Figure 4.1  Series before gap treatment]
[Figure 4.2  Series after gap treatment]

4.1.2  Scaling

Each raw variable L is linearly scaled to [0, 1]:

    S = \frac{L - L_{min}}{L_{max} - L_{min}}    (4-1)

and a network output D is rescaled back to physical units by

    Rs = \frac{D - D_{min}}{D_{max} - D_{min}} \left( L_{max} - L_{min} \right) + L_{min}    (4-2)

where S is the scaled datum, Rs the rescaled datum, L_{min} and L_{max} the extremes of the raw variable, and D_{min} and D_{max} the extremes of the output. Rainfall needs special care: it is zero on 72.69 % of the days, and although its maximum is 470.5 mm, only 11 of the 3022 records exceed 150 mm; rainfall is therefore clipped to 0~150 mm before scaling.
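A sketch of (4-1) and (4-2), with the 150 mm rainfall clipping shown as a usage example (function names are illustrative):

```python
# Min-max scaling (4-1) and rescaling (4-2).
import numpy as np

def scale(L, L_min, L_max):
    """(4-1): map raw values L into [0, 1]."""
    return (L - L_min) / (L_max - L_min)

def rescale(D, D_min, D_max, L_min, L_max):
    """(4-2): map outputs D back to the raw range [L_min, L_max]."""
    return (D - D_min) / (D_max - D_min) * (L_max - L_min) + L_min

rain = np.clip(np.array([0.0, 12.5, 470.5]), 0.0, 150.0)  # clip at 150 mm
print(scale(rain, 0.0, 150.0))
```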

Table 4.2  Ranges of the 13 variables

Variable (unit)          | Min   | Max
Cloud amount (tenths)    | 0     | 10
Relative humidity (%)    | 53    | 97
Solar radiation (MJ/m²)  | 0     | 22.167
Pressure (hPa)           | 993.9 | 1025.7
Sunshine (hr)            | 0     | 12.5
Wind speed (m/s)         | 0.400 | 6.433
Wind direction (°)       | 13    | 360
Temperature (°C)         | 12.37 | 31.20
Rainfall (mm)            | 0     | 470.5
PM10 (µg/m³)             | 20.978 | 194.428
O3 (ppb)                 | 2.377 | 50.988
NO2 (ppb)                | 6.161 | 50.216
Visibility (km)          | 0     | 19.667

4.2  Network input and output

The input vector X holds the scaled features, and the designed output D is the scaled visibility. On the scaled axis the class boundaries 2 km and 8 km fall at 0.1 and 0.398, so a network output y is mapped to a class by: y < 0.1 → 0~2 km; 0.1 ≤ y < 0.398 → 2~8 km; y ≥ 0.398 → >8 km. The data are divided into training data, validation data (for the early stop) and testing data.
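The class decision is just a pair of threshold comparisons; a sketch using the 0.1 and 0.398 cut points defined above:

```python
# Map a scaled network output to a visibility class (thresholds 0.1 and
# 0.398 correspond to 2 km and 8 km on the scaled axis).
def visibility_class(y):
    if y < 0.1:
        return "0~2 km"
    if y < 0.398:
        return "2~8 km"
    return ">8 km"

print([visibility_class(v) for v in (0.05, 0.2, 0.7)])
```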

The experiments proceed in three steps:
1. Feature selection: SFSM searches the 13 features. The data are split 80 % training / 20 % validation; the criterion function of a candidate subset is the validation accuracy of an RBF network trained on it with early stop, averaged over 10 networks.
2. Classification: RBF networks are trained on the SFSM-selected features with an 80 % / 10 % / 10 % training / validation / testing split, and the accuracy on the testing data is averaged over 100 runs.
3. Year-92 forecast: the year-92 days (92/1/1~4/24) form the testing data; the remaining days are split 80 % training / 20 % validation, and results are averaged over 100 runs.
All RBF networks use the same learning rate and momentum settings (0.1 and 0.0; Section 5.1).

RBF Testing data RBF (Learning Rate) (Momentum) 4.3 (3~5 ) (6~8 ) (9~11 ) (12~2 ) 4 83 92 374 340 ( 90.91%) 10~4 (10~4 ) (5~9 ( 6 ) 3 4.3 4.4 4.3 83~92 ( ) 0~2 km 374 69 15 109 181 34 340 2~8 km 1820 476 253 520 571 534 1286 >8 km 829 236 456 95 42 637 192 3023 781 724 724 794 1205 1818 49

Table 4.4  Percentage per visibility class and grouping, years 83~92 (%)

Class   | All   | 3~5   | 6~8   | 9~11  | 12~2  | 5~9   | 10~4
0~2 km  | 12.37 | 8.83  | 2.07  | 15.06 | 22.80 | 2.82  | 18.70
2~8 km  | 60.21 | 60.95 | 34.94 | 71.82 | 71.91 | 44.32 | 70.74
>8 km   | 27.42 | 30.22 | 62.98 | 13.12 | 5.29  | 52.86 | 10.56

4.4  Yearly distribution

Table 4.5 lists the yearly class counts for years 83~91 (year 87 lacks 1/1~11/23; year 92 runs only to 4/24). The number of 0~2 km days grows from 0 in year 83 to a peak of 84 in year 89 before falling back to 35 in year 91, and the ARMA(p,q) model analysis of Section 2.2.2 showed that year 91 behaves differently from years 88~90. Three training periods are therefore used: (1) 83/1/1~92/4/24 (excluding 87/1/1~87/11/23), (2) 88/1/1~92/4/24, (3) 91/1/1~92/4/24; Table 4.6 gives their class distributions.

Table 4.5  Days per visibility class and year

Class   | 83  | 84  | 85  | 86  | 88  | 89  | 90  | 91
0~2 km  | 0   | 7   | 12  | 47  | 59  | 84  | 78  | 35
2~8 km  | 204 | 245 | 232 | 238 | 210 | 208 | 206 | 205
>8 km   | 139 | 113 | 122 | 80  | 96  | 74  | 81  | 105

Table 4.6  Class distribution of the three training periods (days, %)

Class   | 83~92          | 88~92          | 91~92
0~2 km  | 374 (12.37 %)  | 282 (18.24 %)  | 61 (13.56 %)
2~8 km  | 1820 (60.21 %) | 889 (57.50 %)  | 265 (58.89 %)
>8 km   | 829 (27.42 %)  | 375 (24.26 %)  | 124 (27.56 %)

4.5  Experiment summary

Sequential Floating Search Method (SFSM) feature selection and RBF classification are run for each of the three training periods — (1) 83/1/1~92/4/24 excluding 87/1/1~87/11/23, (2) 88/1/1~92/4/24, (3) 91/1/1~92/4/24 — and for seven groupings of the days: (1) all, (2) months 10~4, (3) months 5~9, (4) spring (3~5), (5) summer (6~8), (6) autumn (9~11), (7) winter (12~2). The experiments are: (1) SFSM feature selection and (2) RBF classification for every period and grouping, and

(3) forecasting year 92, where the RBF results are also compared with a Multilayer Perceptron (MLP) trained on the same SFSM-selected features. Figure 4.3 summarizes the flow.

[Figure 4.3  Experiment flow: three periods (83~92, 88~92, 91~92) × seven groupings; SFSM feature selection; RBF classification; year-92 forecast; MLP comparison]

Chapter 5  Results

5.1  Overall classification

For the period 83~92/4/24, RBF networks were trained with learning rate 0.1 and momentum 0.0 on an 80 % / 10 % / 10 % training / validation / testing split. Table 5.1 gives the accuracies before feature selection and Table 5.2 after Sequential Floating Search Method feature selection (Section 5.2).

Table 5.1  Accuracy before feature selection
Samples  | Training 2401 | Validation 300 | Testing 300
Accuracy | 71.7244 %     | 71.5779 %      | 71.0649 %

Table 5.2  Accuracy with SFSM-selected features
Samples  | Training 2401 | Validation 300 | Testing 300
Accuracy | 73.688 %      | 73.1867 %      | 72.9733 %

5.2  SFSM feature selection

This section verifies SFSM on synthetic data (5.2.1), compares it with SFS (5.2.2), and applies it to the visibility data for the seven groupings (5.2.3).

5.2.1  Verification on synthetic data

A two-class synthetic dataset with ten features x1~x10 was generated so that only the pair {x2, x3} separates the classes completely: with a nearest-neighbor (nn) classifier, x1 alone reaches 80 %, x2 or x3 alone 85 %, the group {x4, x5, x6} 90 %, the group {x7, x8, x9, x10} 95 %, and {x2, x3} together 100 %. Figure 5.1 plots the two classes in the x2-x3 plane.

[Figure 5.1  The synthetic classes in the x2-x3 plane (Class 1 vs Class 2, ranges about −12~12)]

Figure 5.2 traces the SFSM search — the criterion value and the selected feature indices at each step. The floating steps are visible wherever previously selected features are dropped again:

CF      | selected feature indices
73.10 % | 1
77.55 % | 1 2
80.55 % | 2 3
82.90 % | 2 3 1
85.20 % | 2 3 1 8
86.20 % | 2 3 1 8 7
87.60 % | 2 3 1 8 7 9
89.20 % | 8 7 9 10
90.50 % | 8 7 9 10 1
90.95 % | 8 7 9 10 1 4
91.55 % | 8 7 9 10 1 4 5
92.95 % | 8 7 9 10 1 4 5 2
93.45 % | 8 7 9 10 1 4 5 2 6
94.10 % | 8 7 9 10 1 4 5 2 6 3

[Figure 5.2  SFSM selection trace on the synthetic data]

5.2.2  SFSM versus SFS

SFS and SFSM were both run on the years 88~90 data. Table 5.3 lists the seven features selected by SFS and Table 5.4 the six selected by SFSM; in both cases the pollutant features figure prominently. RBF networks trained on each selection (80 % / 10 % / 10 % split, testing accuracy averaged over 100 runs) give the results of Tables 5.5 and 5.6: the six SFSM features beat the seven SFS features, illustrating how SFSM escapes the nesting effect.

Table 5.3  Features selected by SFS (7 features, including O3, PM10 and NO2)
Table 5.4  Features selected by SFSM (6 features, including PM10 and O3)

Table 5.5  RBF accuracy with the SFS features
Input dimension | Training  | Validation | Testing
7               | 72.4879 % | 72.4537 %  | 72.5278 %
5               | 72.2798 % | 72.4167 %  | 72.3796 %

Table 5.6  RBF accuracy with the SFSM features
Input dimension | Training  | Validation | Testing
6               | 72.7514 % | 73.3519 %  | 73.5463 %

5.5 SFS RBF RBF (SFS) Input Training Validation Testing SFS Dimension 7 72.4879% 72.4537% 72.5278% 5 72.2798% 72.4167% 72.3796% 5.6 SFSM RBF RBF (SFSM) Input Training Validation Testing SFSM Dimension 6 72.7514% 73.3519% 73.5463% 5.2.3 SFSM SFSM 4.2 5.3 5.7 5.8 5.9 SFSM SFSM CF RBF 10 5.10 O 3 15 12 58

SFS SBS CF 80% Training data 20% Validation data RBF 10 10 RBF CF SFS SBS SFSM 5.3 SFSM RBF 59

Table 5.7  SFSM criterion values by grouping, period 83~92 (pollutants appearing in the selected subsets shown; the other selected feature names were lost)

Grouping | pollutants among the selected features | CF
1        | O3                                     | 73.0196 %
2        | NO2                                    | 71.1897 %
3        | O3                                     | 67.8261 %
4        |                                        | 82.3386 %
5        | O3                                     | 80.7937 %
6        |                                        | 70.6889 %
7        | O3, PM10, NO2                          | 78.5714 %

Table 5.8  SFSM criterion values by grouping, period 88~92

Grouping | pollutants among the selected features | CF
1        |                                        | 69.4634 %
2        | O3, NO2, PM10                          | 69.2604 %
3        | PM10                                   | 66.7007 %
4        | PM10                                   | 77.0863 %
5        | PM10, O3, NO2                          | 76.4688 %
6        |                                        | 65.3527 %
7        | O3, NO2, PM10                          | 73.5963 %

Table 5.9  SFSM selection, period 91~92: PM10, O3 and NO2 among the selected features; CF 72.3099 %

Table 5.10  Number of times each feature was selected

Index | 1 | 2 | 3  | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 (PM10) | 12 (O3) | 13 (NO2)
Count | 6 | 6 | 10 | 8 | 7 | 4 | 9 | 8 | 8 | 7  | 12        | 6       | 7

5.3  RBF classification with the selected features

Following the procedure of Figure 5.4, RBF networks were trained on the SFSM-selected features (Tables 5.7 and 5.8) for the three periods and seven groupings, using the data of Section 4.2 with an 80 % / 10 % / 10 % training / validation / testing split; every figure below is an average over 100 runs. Tables 5.11, 5.12 and 5.13 give the results for periods 83~92, 88~92 and 91~92.

[Figure 5.4  Procedure of the RBF classification experiments]

Table 5.11  RBF accuracy with SFSM features, period 83~92

Grouping | Train | Val | Test | Training  | Validation | Testing
All      | 2401  | 300 | 300  | 73.688 %  | 73.1867 %  | 72.9733 %
10~4     | 1443  | 180 | 180  | 78.7006 % | 79.1333 %  | 78.9667 %
5~9      | 958   | 120 | 120  | 65.3507 % | 65.0583 %  | 64.9417 %
3~5      | 622   | 78  | 78   | 71.4823 % | 71.7692 %  | 69.6154 %
6~8      | 575   | 72  | 72   | 64.8504 % | 64.8611 %  | 63.25 %
9~11     | 572   | 72  | 72   | 78.9213 % | 78.25 %    | 79.1528 %
12~2     | 630   | 79  | 79   | 80.0603 % | 79.4177 %  | 79.2911 %

Table 5.12  RBF accuracy with SFSM features, period 88~92

Grouping | Train | Val | Test | Training  | Validation | Testing
All      | 1230  | 154 | 154  | 70.7244 % | 70.5779 %  | 70.0649 %
10~4     | 747   | 94  | 94   | 74.3829 % | 74.4468 %  | 74.4787 %
5~9      | 483   | 60  | 60   | 65.0083 % | 64.3667 %  | 63.4 %
3~5      | 338   | 42  | 42   | 70.8432 % | 70.2619 %  | 71.3571 %
6~8      | 294   | 37  | 37   | 61.8741 % | 61.3243 %  | 58.9459 %
9~11     | 278   | 35  | 35   | 78.8669 % | 79.8286 %  | 78.4286 %
12~2     | 320   | 40  | 40   | 75.6688 % | 75.325 %   | 75.8 %

Table 5.13  RBF accuracy with SFSM features, period 91~92

Grouping | Train | Val | Test | Training  | Validation | Testing
All      | 356   | 44  | 44   | 71.8507 % | 71.0909 %  | 71.25 %

On the same test days, visibility can also be predicted by combining the grouping-specific models — the pair {10~4, 5~9} or the four seasonal models — instead of the single whole-period model. Tables 5.14 and 5.15 compare the three schemes (the three columns correspond to the whole-period model and the two combinations).

Table 5.14  Whole model versus combined models, period 83~92
Test days | 300       | 300       | 301
Accuracy  | 72.9733 % | 73.3567 % | 72.9136 %

Table 5.15  Whole model versus combined models, period 88~92
Test days | 154       | 154       | 154
Accuracy  | 70.0649 % | 70.1623 % | 71.1363 %

5.4  Comparison with an MLP

Tables 5.11 and 5.12 show that the summer groupings, 5~9 and 6~8, are the hardest for both training periods. Table 5.16 suggests why.

Over all days, 12.37 % fall in the 0~2 km class and 27.42 % exceed 8 km, but in summer the imbalance is extreme. Table 5.16 (repeating Table 4.4) gives the class percentages for years 83~92:

Class   | All   | 3~5   | 6~8   | 9~11  | 12~2  | 5~9   | 10~4
0~2 km  | 12.37 | 8.83  | 2.07  | 15.06 | 22.80 | 2.82  | 18.70
2~8 km  | 60.21 | 60.95 | 34.94 | 71.82 | 71.91 | 44.32 | 70.74
>8 km   | 27.42 | 30.22 | 62.98 | 13.12 | 5.29  | 52.86 | 10.56

Figure 5.5 shows the per-class results of a 5~9 run for 83~92: the 0~2 km class is never hit, while the other two classes reach roughly 60~75 %.

[Figure 5.5  Per-class results, 5~9 grouping, 83~92 — Training: Class 1 0/13, Class 2 302/469, Class 3 332/476; Validation: Class 1 0/2, Class 2 42/63, Class 3 32/55; Testing: Class 1 0/2, Class 2 36/58, Class 3 45/60]

An MLP was trained on the same summer groupings for comparison (Table 5.17); its testing accuracies sit somewhat below the RBF results of Tables 5.11 and 5.12.

Table 5.17  MLP accuracy for the summer groupings

Grouping | Train | Val | Test | Training  | Validation | Testing
5~9      | 483   | 60  | 60   | 60.3918 % | 66.0678 %  | 59.5763 %
6~8      | 294   | 37  | 37   | 59.6351 % | 67.0556 %  | 58.9167 %

5.5  Forecasting year 92

The models are now applied to year 92: the days 92/1/1~4/24 form the testing data, the remaining days are split 80 % training / 20 % validation, and every figure is an average over 100 RBF runs. Tables 5.18 and 5.19 give the results for training periods 83~91 and 88~91, and Tables 5.20 and 5.21 the whole-versus-combined comparison as in Section 5.3.

Table 5.18  Year-92 forecast, trained on 83~91

Grouping | Train | Val | Test | Training  | Validation | Testing
All      | 2310  | 578 | 113  | 73.1299 % | 72.5761 %  | 65.1416 %
10~4     | 1352  | 338 | 113  | 78.4283 % | 78.3047 %  | 67.1858 %
3~5      | 579   | 145 | 54   | 70.7237 % | 70.6 %     | 58.1667 %
12~2     | 583   | 146 | 59   | 80.3482 % | 79.7603 %  | 72.1864 %

Table 5.19  Year-92 forecast, trained on 88~91

Grouping | Train | Val | Test | Training  | Validation | Testing
All      | 1140  | 285 | 113  | 68.9491 % | 68.9404 %  | 65.9204 %
10~4     | 658   | 164 | 113  | 73.1945 % | 72.6768 %  | 67.3894 %
3~5      | 294   | 74  | 54   | 68.7211 % | 67.4324 %  | 62.2037 %
12~2     | 273   | 68  | 59   | 75.967 %  | 74.9559 %  | 72.7119 %

Table 5.20  Year-92 testing accuracy of the three prediction schemes, trained on 83~91
Test days | 113       | 113       | 113
Accuracy  | 65.1416 % | 67.1858 % | 65.4867 %

Table 5.21  Year-92 testing accuracy of the three prediction schemes, trained on 88~91
Test days | 113       | 113       | 113
Accuracy  | 65.9204 % | 67.3894 % | 67.6903 %

5.6  A 610-day sliding experiment

To mimic operational use, the RBF model was retrained on a sliding window of the 83~92 data: each window holds the 610 most recent days, of which the last 10 are testing data and the first 600 are split into 500 training and 100 validation days; the window is then advanced and the procedure repeated 100 times with the SFSM-selected features. Table 5.22 averages the 100 windows.

Table 5.22  Sliding 610-day experiment, years 83~92
Days     | 610 (500 training / 100 validation / 10 testing)
Accuracy | Training 69.344 % | Validation 68.52 % | Testing 68.7 %
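A sketch of the window bookkeeping, assuming the window advances one day at a time; the day count in the usage example is chosen only so that exactly 100 window positions result, matching the 100 repetitions reported:

```python
# Sliding-window splits for the 610-day experiment: each window of 610
# consecutive days is split 500 / 100 / 10 into training, validation
# and testing spans, then advanced by `step` days.
def sliding_windows(n_days, window=610, n_train=500, n_val=100, step=1):
    """Yield (train, val, test) index ranges for each window position."""
    for start in range(0, n_days - window + 1, step):
        train = range(start, start + n_train)
        val = range(start + n_train, start + n_train + n_val)
        test = range(start + n_train + n_val, start + window)
        yield train, val, test

windows = list(sliding_windows(n_days=709, step=1))
print(len(windows))  # -> 100 window positions
```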

Chapter 6  Conclusions and Future Work

6.1  Conclusions

The experiments covered (1) feature selection, (2) RBF classification of visibility, and (3) forecasting year 92 with an MLP comparison. The main conclusions:
1. The Sequential Floating Search Method combines Sequential Forward Selection and Sequential Backward Selection and thereby avoids the nesting effect.
2. Run with its stop criterion set at k features (k > n), SFSM also delivers the best subsets of each smaller size n along the way.
3. Feature selection reduces the input dimension without loss of accuracy.
4. Combining the RBF models trained on the seasonal groupings performs comparably to, or better than, a single whole-period model.
5. The 83~92 and 88~92 training periods lead to similar accuracy.
6. Accuracy on the year-92 forecast is lower than on randomly split test data.
7. The RBF models compare favourably with the MLP, and the 610-day sliding experiment shows the approach can be operated on a rolling basis.

6.2  Future work

1. Consider candidate features beyond the 13 used here.
2. The criterion function inside SFSM averages 10 RBF runs per subset; the reliability of this estimate, and the role of the validation data, deserve further study.
3. Combine RBF and MLP models into committees (averaging, AdaBoost), and use committee accuracy as the SFSM criterion function.

References

G. E. P. Box, G. M. Jenkins, and G. C. Reinsel, Time Series Analysis: Forecasting and Control, 3rd ed. Englewood Cliffs: Prentice Hall, 1994.
G. Zhang, B. E. Patuwo, and M. Y. Hu, "Forecasting with artificial neural networks: the state of the art," International Journal of Forecasting, vol. 14, pp. 35-62, 1998.
J. Barry Gomm and Ding Li Yu, "Selecting radial basis function network centers with recursive orthogonal least squares training," IEEE Transactions on Neural Networks, vol. 11, pp. 306-314, 2000.
J. T. Connor and R. D. Martin, "Recurrent neural networks and robust time series prediction," IEEE Transactions on Neural Networks, vol. 5, no. 2, pp. 240-253, 1994.
P. A. Devijver and J. Kittler, Pattern Recognition: A Statistical Approach. Prentice-Hall, 1982.
P. Pudil, J. Novovičová, and J. Kittler, "Floating search methods in feature selection," Pattern Recognition Letters, vol. 15, no. 11, pp. 1119-1125, 1994.
T. M. Cover and P. E. Hart, "Nearest neighbor pattern classification," IEEE Transactions on Information Theory, vol. IT-13, pp. 21-27, 1967.

[Four further references in Chinese: 1998; 2001 (on RBF networks); 2002; 1999.]