Amiza Amir
Preferred name: Amiza Amir
Official name: Amiza, Amir
Scopus Author ID: 36170326400
Researcher ID: EKV-8568-2022
Publications (3)
Gain Enhancement of CPW Antenna for IoT Applications using FSS with Miniaturize Unit Cell (2021-07-26)
Authors: Azhari M.S.B.A.; Jiunn N.K.
Wireless connectivity is a critical enabler for many IoT applications. Antennas often have to be installed inside the device cover, so they must be small while still performing well: a suitable antenna should have high efficiency, high gain, and adequate bandwidth covering the desired frequency range. Here, we propose a new type of Frequency Selective Surface (FSS) with a miniaturized resonator element to enhance the gain of a CPW antenna. The miniaturization of the FSS unit cell is attained by coupling two meandered wire resonators, separated by a single thin substrate layer. The resulting FSS unit cell is miniaturized to 0.057λ. The CPW antenna measures only 28.8 mm × 46.5 mm and operates at 2.45 GHz. With the addition of the FSS, the antenna's gain increases from 1.8 dBi to 2.6 dBi while retaining an omnidirectional radiation pattern.
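As a quick sanity check on the quoted dimensions (not part of the paper), the free-space wavelength at 2.45 GHz and the corresponding 0.057λ unit-cell size follow directly from λ = c/f:

```python
# Free-space wavelength at the antenna's operating frequency, and the
# corresponding miniaturized FSS unit-cell size (0.057 * wavelength).
C = 299_792_458.0  # speed of light in vacuum, m/s
f = 2.45e9         # operating frequency, Hz

wavelength_mm = C / f * 1e3            # ~122.4 mm
unit_cell_mm = 0.057 * wavelength_mm   # ~7.0 mm

print(f"wavelength at 2.45 GHz: {wavelength_mm:.1f} mm")
print(f"0.057 lambda unit cell: {unit_cell_mm:.2f} mm")
```

At roughly 7 mm, the unit cell is indeed far smaller than the 28.8 mm × 46.5 mm antenna it sits behind, which is the point of the miniaturization.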
The Performance Analysis of K-Nearest Neighbors (K-NN) Algorithm for Motor Imagery Classification Based on EEG Signal (2017-12-11)
Authors: Nurul E’zzati Md Isa
Most EEG-based motor imagery classification research focuses on the feature extraction phase of machine learning, neglecting the classification stage, which is equally crucial for accuracy. In contrast, this paper concentrates on classifier development, thoroughly analysing the performance of the k-Nearest Neighbour (k-NN) classifier on EEG data. In the literature, the Euclidean distance metric is routinely applied for EEG data classification, but no thorough study has evaluated the effect of other distance metrics on classification accuracy. Therefore, this paper studies the effectiveness of five distance metrics for k-NN: Manhattan, Euclidean, Minkowski, Chebyshev, and Hamming. The experiments show that the Minkowski distance yields the highest classification accuracy, at 70.08%. This demonstrates the significant effect of the distance metric on k-NN accuracy, with the Minkowski distance outperforming the Euclidean. Our results also show that the accuracy of k-NN is comparable to that of a Support Vector Machine (SVM), with lower complexity, for EEG classification.
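A minimal sketch of this kind of metric comparison, assuming scikit-learn and synthetic placeholder data (the paper's actual EEG features, choice of k, and Minkowski order p are not specified here):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Placeholder for extracted EEG motor-imagery features (two classes).
X, y = make_classification(n_samples=400, n_features=20, random_state=0)

metrics = [
    ("manhattan", {}),
    ("euclidean", {}),
    ("minkowski", {"p": 3}),  # p > 2 distinguishes it from Euclidean (p = 2)
    ("chebyshev", {}),
    ("hamming", {}),          # usually applied to discretized/binary features
]

# Cross-validated accuracy for k-NN under each distance metric.
for name, extra in metrics:
    knn = KNeighborsClassifier(n_neighbors=5, metric=name, **extra)
    acc = cross_val_score(knn, X, y, cv=5).mean()
    print(f"{name:>9}: {acc:.3f}")
```

Only the distance metric varies between runs, so any accuracy difference is attributable to the metric itself, which mirrors the paper's experimental design.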
Evaluating Tree-based Ensemble Strategies for Imbalanced Network Attack Classification (2024-01-01)
Authors: Soon H.F.; Nishizaki H.
With the continual evolution of cybersecurity threats, the development of effective intrusion detection systems is increasingly crucial and challenging. This study tackles these challenges by exploring imbalanced multiclass classification, a common situation in network intrusion datasets that mirrors real-world scenarios. The paper empirically assesses the performance of diverse classification algorithms in managing imbalanced class distributions. Experiments were conducted on the UNSW-NB15 network intrusion detection benchmark dataset, which comprises ten highly imbalanced classes. The evaluation includes basic, traditional algorithms such as the Decision Tree, K-Nearest Neighbor, and Gaussian Naive Bayes, as well as advanced ensemble methods such as Gradient Boosted Decision Trees (GraBoost) and AdaBoost. Our findings reveal that the Decision Tree surpassed the Multi-Layer Perceptron, K-Nearest Neighbor, and Naive Bayes in overall F1-score. Furthermore, thorough evaluations of nine tree-based ensemble algorithms showcased their varying efficacy: Bagging, Random Forest, ExtraTrees, and XGBoost achieved the highest F1-scores. In the per-class analysis, XGBoost demonstrated exceptional performance relative to the other algorithms, achieving the highest F1-score in eight of the ten classes. These results establish XGBoost as a predominant method for handling multiclass imbalanced classification, with Bagging the closest feasible alternative, as it attains nearly the same accuracy and F1-score as XGBoost.
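The comparison described above can be reproduced in outline. The following is a minimal sketch, assuming scikit-learn and xgboost, where load_unsw_nb15 is a hypothetical stand-in for fetching and preprocessing the dataset (with labels already integer-encoded):

```python
from sklearn.ensemble import (
    BaggingClassifier,
    ExtraTreesClassifier,
    RandomForestClassifier,
)
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_unsw_nb15()  # hypothetical loader: features and encoded labels
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

models = {
    "Bagging": BaggingClassifier(n_estimators=100, random_state=0),
    "RandomForest": RandomForestClassifier(n_estimators=100, random_state=0),
    "ExtraTrees": ExtraTreesClassifier(n_estimators=100, random_state=0),
    "XGBoost": XGBClassifier(n_estimators=100, random_state=0),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    # Macro-averaged F1 weights every class equally, which is what matters
    # under heavy class imbalance; per-class scores show where each model wins.
    print(name, f1_score(y_te, pred, average="macro"))
    print(f1_score(y_te, pred, average=None))  # per-class F1-scores
```

Macro-averaged F1 is the natural headline metric here because plain accuracy is dominated by the majority classes, while the per-class vector exposes the kind of result the paper reports, such as XGBoost winning eight of ten classes.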