Tuberculosis Classification Using Deep Learning and FPGA Inferencing

Journal
Journal of Advanced Research in Applied Sciences and Engineering Technology
Date Issued
2023-02-01
Author(s)
Fazrul Faiz Zakaria
Universiti Malaysia Perlis
Asral Bahari Jambek
Universiti Malaysia Perlis
Norfadila Mahrom
Universiti Malaysia Perlis
Rafikha Aliana A Raof
Universiti Malaysia Perlis
Mohd Nazri Mohd Warip
Universiti Malaysia Perlis
Al Eh Kan P.L.
Mustapa M.
DOI
10.37934/araset.29.3.105114
Abstract
Tuberculosis (TB), a chronic lung illness caused by a bacterial infection, is among the top 10 leading causes of mortality. Owing to its efficiency and performance, deep learning with an FPGA as an accelerator is adopted in this work. However, given the vast amount of data collected for medical diagnosis, the average inference speed is inadequate. In this scenario, the FPGA accelerates the deep learning inference process, enabling real-time deployment of TB classification with low latency. This paper summarizes the findings of deploying the model across various computing devices for deep learning inference with an FPGA. The study includes model performance evaluation and a throughput and latency comparison across different batch sizes, to gauge the expected delay for real-world deployment. The results conclude that the FPGA is the most suitable deep learning inference accelerator, offering a high throughput-to-latency ratio and fast parallel inference. FPGA inferencing demonstrated a 21.8% increase in throughput while maintaining 31% lower latency than GPU inferencing and 6x greater energy efficiency. The proposed inferencing also delivered over 90% accuracy and selectivity in detecting and localizing TB.
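The throughput-versus-latency comparison described in the abstract can be outlined with a simple timing harness. The sketch below is a minimal illustration, not the authors' code: run_inference is a hypothetical stand-in for whatever accelerator runtime (FPGA, GPU, or CPU) is under test, and the image shape, batch sizes, and repeat count are assumptions for demonstration only.

import time
import numpy as np

def benchmark(run_inference, batch_sizes, image_shape=(224, 224, 3), repeats=50):
    """Measure mean latency (ms) and throughput (images/s) per batch size.

    run_inference: callable taking a (batch, H, W, C) float32 array and
    returning predictions; stands in for the accelerator runtime under test.
    """
    results = {}
    for batch in batch_sizes:
        x = np.random.rand(batch, *image_shape).astype(np.float32)
        run_inference(x)  # warm-up call, excluded from timing
        start = time.perf_counter()
        for _ in range(repeats):
            run_inference(x)
        elapsed = time.perf_counter() - start
        latency_ms = 1000.0 * elapsed / repeats      # mean time per batch
        throughput = batch * repeats / elapsed        # images processed per second
        results[batch] = (latency_ms, throughput)
    return results

if __name__ == "__main__":
    # Hypothetical stub returning two-class (TB / normal) scores per image.
    stub = lambda x: np.zeros((x.shape[0], 2), dtype=np.float32)
    for batch, (lat, thr) in benchmark(stub, [1, 4, 8, 16]).items():
        print(f"batch={batch:<3d} latency={lat:7.2f} ms  throughput={thr:8.1f} img/s")

Running the same harness with each device's runtime plugged in as run_inference yields the per-batch-size latency and throughput figures that underpin the comparison reported above.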
Funding(s)
Ministry of Higher Education, Malaysia
Subjects
  • Deep-learning | FPGA ...

File(s)
research repository notification.pdf (4.4 MB)
Views
2
Acquisition Date
Nov 19, 2024