USP Electronic Research Repository

Enhancing neural network classification using fractional-order activation functions

Kumar, Meshach and Mehta, Utkal V. and Cirrincione, Giansalvo (2024) Enhancing neural network classification using fractional-order activation functions. AI Open, 5, pp. 10-22. ISSN 2666-6510


Abstract

In this paper, a series of novel activation functions is presented, derived using the improved Riemann–Liouville conformable fractional derivative (CFD). This study investigates the use of fractional activation functions in Multilayer Perceptron (MLP) models and their impact on classification performance, verified on the IRIS, MNIST and FMNIST datasets. Fractional activation functions introduce a non-integer power exponent, allowing complex patterns and representations to be captured more effectively. The experiments compare MLP models employing fractional activation functions, such as the fractional sigmoid, hyperbolic tangent and rectified linear unit, against traditional models using standard activation functions, their improved versions and existing fractional functions. The numerical studies confirm the theoretical observations made in the paper. The findings highlight the potential of the new functions as a valuable tool for classification in deep learning. The study suggests that incorporating fractional activation functions in MLP architectures can lead to superior accuracy and robustness.
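
The exact fractional activation functions are derived in the full text from the improved Riemann–Liouville CFD and are not reproduced on this page. As a rough illustration of the general idea only, the sketch below implements a hypothetical fractional-order sigmoid in which a tunable non-integer exponent alpha reshapes the input before the usual squashing, so that alpha = 1 recovers the standard sigmoid. The class name FractionalSigmoid, the sign-preserving power form and the default alpha = 0.8 are assumptions made for illustration, not the paper's definitions.

# Minimal sketch of a fractional-order activation, assuming a simple
# sign-preserving power-exponent form (NOT the paper's CFD-derived
# expressions; see the full text for those).
import torch
import torch.nn as nn

class FractionalSigmoid(nn.Module):
    """Hypothetical fractional-order sigmoid with power exponent alpha."""

    def __init__(self, alpha: float = 0.8):
        super().__init__()
        self.alpha = alpha  # non-integer power exponent; alpha = 1 gives the standard sigmoid

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Apply a sign-preserving non-integer power, then the usual sigmoid squashing.
        z = torch.sign(x) * torch.abs(x).pow(self.alpha)
        return torch.sigmoid(z)

# Drop-in use inside a small MLP classifier (e.g. IRIS: 4 features, 3 classes).
model = nn.Sequential(
    nn.Linear(4, 16),
    FractionalSigmoid(alpha=0.8),
    nn.Linear(16, 3),
)

The module is a drop-in replacement for nn.Sigmoid in any MLP, which is how such functions would typically be compared against standard activations on IRIS, MNIST and FMNIST.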

Item Type: Journal Article
Subjects: T Technology > TK Electrical engineering. Electronics. Nuclear engineering > Robotics and Automation
Divisions: School of Information Technology, Engineering, Mathematics and Physics (STEMP)
Depositing User: Utkal Mehta
Date Deposited: 01 Feb 2024 03:11
Last Modified: 01 Feb 2024 23:21
URI: https://repository.usp.ac.fj/id/eprint/14396
