Sharma, Anuraganand (2018) Guided Stochastic Gradient Descent Algorithm for inconsistent datasets. Applied Soft Computing, 73. 1068-1080. ISSN 1568-4946
Abstract
The Stochastic Gradient Descent (SGD) algorithm, despite its simplicity, is an effective and widely used default optimization algorithm for machine learning classification models such as neural networks and logistic regression. However, SGD's gradient step is biased by the random selection of a data instance; in this paper, this effect is termed data inconsistency. The proposed variation of SGD, the Guided Stochastic Gradient Descent (GSGD) algorithm, attempts to overcome this inconsistency in a given dataset through greedy selection of consistent data instances for gradient descent. Empirical test results show the efficacy of the method. Moreover, GSGD has also been incorporated into and tested with other popular variations of SGD, such as Adam, Adagrad and Momentum. Under a limited time budget, the guided search of GSGD achieves better convergence and classification accuracy than canonical SGD and its other variations. Additionally, it maintains the same efficiency when evaluated on medical benchmark datasets with logistic regression for classification.
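The greedy-selection idea described in the abstract can be sketched roughly as follows. This is an illustrative approximation, not the authors' implementation: the function names, the candidate-pool size `rho`, and the "consistency" criterion (preferring the instance whose loss deviates least from the pool average) are assumptions made for the sketch, using logistic regression as in the paper's experiments.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def instance_losses(w, X, y):
    # Per-instance logistic (cross-entropy) loss under weights w.
    p = sigmoid(X @ w)
    eps = 1e-12
    return -(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def gsgd_sketch(X, y, lr=0.1, epochs=50, rho=4, seed=0):
    """Hypothetical sketch of guided instance selection for SGD.

    Instead of updating on a uniformly random instance, draw a small
    candidate pool and greedily keep the instance judged most
    "consistent" -- here, the one whose individual loss is closest to
    the pool's mean loss. The criterion is an assumption for
    illustration, not the paper's exact rule.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for _ in range(n):
            idx = rng.choice(n, size=rho, replace=False)
            losses = instance_losses(w, X[idx], y[idx])
            pick = idx[np.argmin(np.abs(losses - losses.mean()))]
            # Standard logistic-regression gradient step on the chosen instance.
            p = sigmoid(X[pick] @ w)
            w -= lr * (p - y[pick]) * X[pick]
    return w
```

On a simple linearly separable problem, the sketch trains a usable classifier; it is meant only to make the selection step concrete, not to reproduce the paper's results.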
Item Type: | Journal Article |
---|---|
Uncontrolled Keywords: | Stochastic Gradient Descent Algorithm, Machine learning, Classification, Logistic regression, Neural networks, Greedy selection, Guided Stochastic Gradient Descent Algorithm |
Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science; Q Science > QA Mathematics > QA76 Computer software |
Divisions: | Faculty of Science, Technology and Environment (FSTE) > School of Computing, Information and Mathematical Sciences |
Depositing User: | Anuraganand Sharma |
Date Deposited: | 17 Jul 2019 22:36 |
Last Modified: | 17 Jul 2019 22:36 |
URI: | https://repository.usp.ac.fj/id/eprint/11610 |