A study of extreme learning machine on small sample-sized classification problems

Journal
Journal of Physics: Conference Series
ISSN
1742-6588
Date Issued
2021-12-01
Author(s)
Ooi B.P.
Norasmadi Abdul Rahim (Universiti Malaysia Perlis)
Maz Jamilah Masnan (Institute of Engineering Mathematics)
Ammar Zakaria (Universiti Malaysia Perlis)
DOI
10.1088/1742-6596/2107/1/012013
Abstract
Extreme learning machine (ELM) is a special type of single hidden layer feedforward neural network that emphasizes training speed and generalization performance. In the ELM model, the weights of the hidden neurons need not be tuned, and the weights of the output neurons can be calculated analytically using the Moore-Penrose generalized inverse. The ELM classifier is therefore well suited to homogeneous ensemble models, because the untuned random hidden weights promote diversity even when every member is trained on the same data. This paper studies the effectiveness of ELM ensemble models in solving small sample-sized classification problems. The study involves two variants of the ensemble model: the standard ELM ensemble with majority voting (ELE) and the random subspace method (RS-ELM). To simulate the small-sample case, only 30% of the total data is used for training. Experimental results show that the RS-ELM model outperforms a multi-layer perceptron (MLP) model under the assumptions of a Friedman test, while the ELE model performs similarly to an MLP model under the same assumptions.
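The abstract outlines the full pipeline: random, untuned hidden weights, output weights obtained from the Moore-Penrose pseudoinverse, a majority-voting ensemble (ELE), and a random subspace variant (RS-ELM). The following is a minimal NumPy sketch of that idea, not the paper's implementation; it assumes integer class labels starting at 0, a sigmoid hidden activation, and illustrative defaults for the hidden-layer size, ensemble size, and feature-subspace ratio.

import numpy as np


def one_hot(y, n_classes):
    """Encode integer class labels (0..n_classes-1) as one-hot targets."""
    Y = np.zeros((len(y), n_classes))
    Y[np.arange(len(y)), y] = 1.0
    return Y


class ELM:
    """Single hidden layer feedforward network in the ELM style:
    hidden weights are random and never tuned; output weights are
    obtained in closed form via the Moore-Penrose pseudoinverse."""

    def __init__(self, n_hidden=50, rng=None):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(rng)

    def _hidden(self, X):
        # Sigmoid hidden-layer activations (the activation choice is illustrative).
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # Output weights: beta = pinv(H) @ T, the Moore-Penrose solution.
        self.beta = np.linalg.pinv(H) @ one_hot(y, n_classes)
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)


class ELMEnsemble:
    """Homogeneous ELM ensemble combined by majority voting (ELE-style).
    With subspace < 1.0 each member is trained on a random subset of the
    features, giving a random-subspace (RS-ELM-style) variant."""

    def __init__(self, n_members=10, n_hidden=50, subspace=1.0, seed=0):
        self.n_members, self.n_hidden, self.subspace = n_members, n_hidden, subspace
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_feat = X.shape[1]
        k = max(1, int(round(self.subspace * n_feat)))
        self.members = []
        for _ in range(self.n_members):
            feats = self.rng.choice(n_feat, size=k, replace=False)
            model = ELM(self.n_hidden, rng=self.rng).fit(X[:, feats], y)
            self.members.append((feats, model))
        return self

    def predict(self, X):
        # Stack each member's predictions, then take the per-sample majority vote.
        votes = np.stack([m.predict(X[:, f]) for f, m in self.members])
        return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

Under these assumptions, a call such as ELMEnsemble(n_members=15, subspace=0.6, seed=1).fit(X_train, y_train).predict(X_test) would correspond to the random-subspace setting, while subspace=1.0 reduces it to the plain majority-voting ensemble; the paper's actual hyperparameters and evaluation protocol are given in the full text.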
Funding(s)
Ministry of Higher Education, Malaysia
File(s)
research repository notification.pdf (4.4 MB)