Nik Adilah Hanin Zahri
Preferred Name
Nik Adilah Hanin Zahri
Official Name
Nik Adilah Hanin, Zahri
Alternative Name
Zahri, Nik Adilah Hanin Binti
Zahri, N. A. H.
Hanin, Nik Adilah
Zahri, Nik Adilah Hanin
Adilah Hanin Zahri, Nik
Main Affiliation
Scopus Author ID
57191919794
Researcher ID
GJQ-4994-2022
Publication: Evaluating Tree-based Ensemble Strategies for Imbalanced Network Attack Classification (2024-01-01)
Zahri N.A.H.; Soon H.F.; Nishizaki H.

With the continual evolution of cybersecurity threats, the development of effective intrusion detection systems is increasingly crucial and challenging. This study tackles these challenges by exploring imbalanced multiclass classification, a common situation in network intrusion datasets that mirrors real-world scenarios. The paper empirically assesses the performance of diverse classification algorithms in managing imbalanced class distributions. Experiments were conducted on the UNSW-NB15 network intrusion detection benchmark dataset, which comprises ten highly imbalanced classes. The evaluation includes basic, traditional algorithms such as the Decision Tree, K-Nearest Neighbor, and Gaussian Naive Bayes, as well as advanced ensemble methods such as Gradient Boosted Decision Trees (GraBoost) and AdaBoost. Our findings reveal that the Decision Tree surpassed the Multi-Layer Perceptron, K-Nearest Neighbor, and Naive Bayes in overall F1-score. Furthermore, thorough evaluations of nine tree-based ensemble algorithms were performed, showcasing their varying efficacy. Bagging, Random Forest, ExtraTrees, and XGBoost achieved the highest F1-scores. In the individual class analysis, however, XGBoost demonstrated exceptional performance relative to the other algorithms, achieving the highest F1-score in eight of the ten classes in the dataset. These results establish XGBoost as a predominant method for handling multiclass imbalanced classification, with Bagging being the closest feasible alternative, as it achieves nearly the same accuracy and F1-score as XGBoost.
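As a rough illustration of the evaluation protocol described in the abstract, the sketch below trains several tree-based ensembles on an imbalanced ten-class problem and compares their macro and per-class F1-scores using scikit-learn and xgboost. A synthetic dataset stands in for UNSW-NB15 (which must be obtained separately), and all hyperparameters are illustrative defaults rather than the settings used in the paper.

# Minimal sketch: compare tree-based ensembles on an imbalanced
# ten-class dataset via macro and per-class F1-scores.
# Synthetic data is a stand-in for UNSW-NB15; hyperparameters are
# illustrative defaults, not the paper's configuration.
from sklearn.datasets import make_classification
from sklearn.ensemble import (BaggingClassifier, ExtraTreesClassifier,
                              RandomForestClassifier)
from sklearn.metrics import classification_report, f1_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Imbalanced 10-class problem, loosely mimicking the class skew of UNSW-NB15.
X, y = make_classification(
    n_samples=20_000,
    n_features=40,
    n_informative=20,
    n_classes=10,
    weights=[0.55, 0.15, 0.10, 0.07, 0.05,
             0.03, 0.02, 0.015, 0.01, 0.005],
    random_state=42,
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42
)

models = {
    "Bagging": BaggingClassifier(n_estimators=100, random_state=42),
    "RandomForest": RandomForestClassifier(n_estimators=100, random_state=42),
    "ExtraTrees": ExtraTreesClassifier(n_estimators=100, random_state=42),
    "XGBoost": XGBClassifier(n_estimators=100, eval_metric="mlogloss",
                             random_state=42),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    # Macro F1 weights every class equally, so minority classes count as
    # much as the majority class -- the key concern under class imbalance.
    print(f"{name}: macro F1 = {f1_score(y_test, y_pred, average='macro'):.3f}")
    # The per-class breakdown shows which classes each model handles best,
    # mirroring the paper's individual class analysis.
    print(classification_report(y_test, y_pred, digits=3))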