Slotting Learning Rate in Deep Neural Networks to Build Stronger Models

Dilip Kumar Sharma, Bhopendra Singh, Mamoona Anam, Klinge Orlando Villalba-Condori, Ankur Kumar Gupta, Ghassan Khazal Ali

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In recent years, deep neural networks have made substantial progress in object recognition. However, one open issue in deep learning is that it is often unclear which proposed architecture is best suited to a specific problem. As a result, distinct configurations are attempted before one that produces satisfactory results is discovered. This paper describes a distributed supervised learning method for finding the best network architecture by modifying hyperparameters for a given task dynamically. On the MNIST dataset, it is shown that asynchronous supervised learning can converge on a solution within the search space. Setting several hyperparameters can be time-consuming when constructing neural networks, so this paper also provides tips and guidelines for organizing the hyperparameter tuning process, which should help practitioners find a good setting for the hyperparameters much faster.
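The abstract does not spell out how the learning rate is "slotted", so the following is a minimal sketch under one plausible reading: sweep a log-spaced grid of candidate learning-rate slots on MNIST and keep the best-performing one. The slot values, the small Keras MLP, and the two-epoch budget below are illustrative assumptions, not the authors' actual method.

```python
# Illustrative sketch only: slot values, model, and training budget are
# assumptions, since the paper's exact slotting procedure is not given here.
import tensorflow as tf

# Load and normalize MNIST (the dataset named in the abstract).
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

def build_model():
    # A small MLP baseline; any fixed architecture works for the sweep.
    return tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

# Candidate learning-rate "slots": a log-spaced grid over a plausible range.
lr_slots = [1e-4, 3e-4, 1e-3, 3e-3, 1e-2]

results = {}
for lr in lr_slots:
    model = build_model()
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # Short run per slot; real tuning would use more epochs per candidate.
    hist = model.fit(x_train, y_train, epochs=2, batch_size=128,
                     validation_split=0.1, verbose=0)
    results[lr] = hist.history["val_accuracy"][-1]

best_lr = max(results, key=results.get)
print(f"best slot: lr={best_lr}, val_accuracy={results[best_lr]:.4f}")
```

A distributed variant, as the abstract suggests, would evaluate the slots asynchronously across workers rather than in this sequential loop.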

Original language: English
Title of host publication: Proceedings - 2nd International Conference on Smart Electronics and Communication, ICOSEC 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1587-1593
Number of pages: 7
ISBN (Electronic): 9781665433686
DOIs
State: Published - 2021
Externally published: Yes
Event: 2nd International Conference on Smart Electronics and Communication, ICOSEC 2021 - Trichy, India
Duration: 7 Sep 2021 – 9 Sep 2021

Publication series

Name: Proceedings - 2nd International Conference on Smart Electronics and Communication, ICOSEC 2021

Conference

Conference: 2nd International Conference on Smart Electronics and Communication, ICOSEC 2021
Country/Territory: India
City: Trichy
Period: 7/09/21 – 9/09/21

Bibliographical note

Publisher Copyright:
© 2021 IEEE.

Keywords

  • Artificial intelligence
  • Convolutional neural network
  • Deep learning
  • Hyper-parameter
  • Hyperparameter tuning
  • Machine learning
  • Neural networks
  • Orthogonal array
