An Adaptive Tracking Grey Wolf Optimization Algorithm for SVM Parameter Optimization
First published: 2025-08-21
WANG Fei (2000-), female, master's student; main research interests: optimization theory and methods
WANG Yong (1967-), male, professor; main research interests: machine learning, systems optimization and decision-making
ZHANG Hong (1981-), female, professor; main research interest: value co-creation
Abstract: The standard Grey Wolf Optimizer (GWO) and the Sequential Minimal Optimization (SMO) algorithm demonstrate limited optimization accuracy and poor stability when solving the parameters of nonlinear support vector machines. This paper proposes an Adaptive Tracking Grey Wolf Optimizer (ATGWO) based on the GWO framework, incorporating an adaptive operator and a tracking operator. The adaptive operator automatically adjusts which grey wolves move by analyzing the change in the leading wolf's position between consecutive generations and by weighting the three hierarchical wolf levels, thereby enhancing the algorithm's exploration capability. The tracking operator controls whether a wolf tracks, as well as its tracking direction and distance, to improve the algorithm's exploitation capability. To satisfy the constraints of the Lagrangian objective function, a mapping encoding for grey wolf individuals is designed. Experimental results show that the proposed mapping encoding effectively handles the equality constraint, that ATGWO minimizes the Lagrangian objective function more stably and to better values than standard GWO and its improved variants, and that classification accuracy improves considerably over the SMO algorithm. As the hyperparameter multiplier increases from 0.01 to 10.00, the minimum value of the objective function decreases approximately linearly (i.e., becomes increasingly negative); although this can enhance classification performance, an excessively large multiplier causes numerical instability, so a balance must be struck between performance gains and numerical stability.
Keywords: nonlinear support vector machine; parameter optimization; sequential minimal optimization; grey wolf optimizer; Gaussian kernel function
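The abstract refers to a mapping encoding that keeps each grey wolf individual inside the feasible region of the Lagrangian (dual) objective, i.e. the box constraint 0 ≤ α_i ≤ C and the equality constraint Σ_i α_i y_i = 0. The paper's own encoding is not reproduced here; the Python sketch below is only one plausible reading of the idea, in which a raw wolf position is alternately clipped to the box and shifted along y to cancel the equality residual, and the negated dual objective with a Gaussian kernel is then evaluated on the result. The names gaussian_kernel_matrix, map_to_feasible and dual_objective, the gamma parameterization of the kernel, and the clip-and-shift projection are all assumptions introduced for illustration, not the authors' published encoding.

import numpy as np

def gaussian_kernel_matrix(X, gamma):
    # Pairwise Gaussian (RBF) kernel: K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def map_to_feasible(position, y, C, iters=50):
    # Hypothetical mapping encoding: alternate box clipping with a shift along y
    # that cancels the residual of the equality constraint sum_i alpha_i * y_i = 0.
    alpha = np.clip(position, 0.0, C)
    for _ in range(iters):
        residual = alpha @ y
        if abs(residual) < 1e-12:
            break
        # Subtracting (residual / n) * y zeroes the residual because y_i^2 = 1.
        alpha = np.clip(alpha - residual * y / len(y), 0.0, C)
    return alpha

def dual_objective(alpha, y, K):
    # Negated SVM dual, so that smaller values are better for a minimizer:
    # 0.5 * alpha^T (y y^T .* K) alpha - sum(alpha).
    Q = (y[:, None] * y[None, :]) * K
    return 0.5 * alpha @ Q @ alpha - alpha.sum()

In an ATGWO-style loop, each wolf's raw position could be passed through map_to_feasible before dual_objective is used as its fitness, which is one way the equality constraint reported in the experiments might be enforced.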
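The adaptive and tracking operators themselves are described only at a high level: an adaptive weighting of the three leader wolves driven by the change in the head wolf's position across generations, and a tracking move whose activation, direction and distance are controlled by the algorithm. As a rough illustration under those stated assumptions, the sketch below grafts an explicitly weighted leader average and a displacement-following tracking step onto a standard GWO position update; weighted_gwo_step, tracking_step, the weight vector and the step fraction are placeholders rather than the published operators.

import numpy as np

def weighted_gwo_step(X, leaders, weights, a, rng):
    # Standard GWO position update, except that the alpha/beta/delta
    # contributions are combined with explicit weights instead of a plain average.
    n, dim = X.shape
    X_new = np.empty_like(X)
    for i in range(n):
        moves = []
        for L, w in zip(leaders, weights):
            A = 2.0 * a * rng.random(dim) - a   # coefficient shrinks as a decays over iterations
            C = 2.0 * rng.random(dim)
            D = np.abs(C * L - X[i])
            moves.append(w * (L - A * D))
        X_new[i] = sum(moves) / sum(weights)
    return X_new

def tracking_step(X, leader, prev_leader, step, rng):
    # Hypothetical tracking operator: if the head wolf moved between generations,
    # nudge every wolf a random fraction of that displacement in the same
    # direction; if the head wolf stalled, skip tracking.
    direction = leader - prev_leader
    if np.allclose(direction, 0.0):
        return X
    return X + step * rng.random(X.shape) * direction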