Mitigating Overfitting in Extreme Learning Machine Classifier Through Dropout Regularization

Journal
Applied Mathematics and Computational Intelligence (AMCI)
ISSN
2289-1315
Date Issued
2024-02-14
Author(s)
Fateh Alrahman Kamal Qasem Alnagashi
Universiti Malaysia Perlis
Norasmadi Abdul Rahim
Universiti Malaysia Perlis
Shazmin Aniza Abdul Shukor
Universiti Malaysia Perlis
Mohamad Hanif Abd Hamid
Universiti Malaysia Perlis
DOI
https://doi.org/10.58915/amci.v13iNo.1.561
Handle (URI)
https://ejournal.unimap.edu.my/index.php/amci/article/view/561/355
https://hdl.handle.net/20.500.14170/15118
Abstract
Achieving optimal machine learning model performance is often hindered by the limited availability of diverse datasets, a challenge exacerbated by small sample sizes in real-world scenarios. In this study, we address this critical issue in classification tasks by integrating the Dropout technique into the Extreme Learning Machine (ELM) classifier. Our research underscores the effectiveness of Dropout-ELM in mitigating overfitting, especially when data is scarce, leading to enhanced generalization capabilities. Through extensive experiments on synthetic and real-world datasets, our findings consistently demonstrate that Dropout-ELM outperforms traditional ELM, yielding significant accuracy improvements ranging from 0.19% to 16.20%. By strategically implementing dropout during training, we promote the development of robust models that reduce reliance on specific features or neurons, resulting in increased adaptability and resilience across diverse datasets. Ultimately, Dropout-ELM emerges as a potent tool to counter overfitting and bolster the performance of ELM-based classifiers, particularly in scenarios with limited data. Its established efficacy positions it as a valuable asset for enhancing the reliability and generalization of machine learning models, providing a robust solution to the challenges posed by constrained training data.
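The abstract describes applying Dropout to the hidden layer of an Extreme Learning Machine (ELM), whose output weights are solved in closed form. The paper does not reproduce its implementation here, so the following is only a minimal sketch of the general Dropout-ELM idea under common assumptions (tanh hidden activations, inverted dropout applied before the pseudoinverse solve); the function names, hidden-layer size, and dropout rate are illustrative, not the authors' settings.

```python
import numpy as np

def train_dropout_elm(X, T, n_hidden=64, drop_rate=0.2, seed=0):
    """Sketch of an ELM classifier with dropout on the hidden layer.

    X: (n_samples, n_features) training inputs
    T: (n_samples, n_classes) one-hot targets

    ELM: hidden weights are random and fixed; only the output weights
    are learned, in closed form via the pseudoinverse. Dropout randomly
    zeroes hidden activations before the solve (inverted dropout, so no
    rescaling is needed at inference).
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                    # hidden-layer activations
    mask = rng.random(H.shape) >= drop_rate   # random dropout mask
    H_drop = H * mask / (1.0 - drop_rate)     # inverted-dropout scaling
    beta = np.linalg.pinv(H_drop) @ T         # closed-form output weights
    return W, b, beta

def predict_elm(X, W, b, beta):
    """Inference uses the full hidden layer (no dropout)."""
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)
```

Because dropout perturbs the hidden representation before the output weights are fit, the solved weights cannot lean too heavily on any single hidden neuron, which is the overfitting-mitigation mechanism the abstract refers to.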
Subjects
  • Artificial datasets
  • Classification
  • Dropout
  • Machine Learning
  • Real-world datasets
  • Small sample-sized
  • Test Accuracy
File(s)
26-35 Mitigating Overfitting in Extreme Learning Machine Classifier Through Dropout Regularization.pdf (368.88 KB)