Slotting Learning Rate in Deep Neural Networks to Build Stronger Models

Dilip Kumar Sharma, Bhopendra Singh, Mamoona Anam, Klinge Orlando Villalba-Condori, Ankur Kumar Gupta, Ghassan Khazal Ali

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In recent years, deep neural networks have made substantial progress in object recognition. However, one open issue in deep learning is that it is often unclear which proposed architecture is best suited to a specific problem. As a result, distinct configurations are tried until one that produces satisfactory results is found. This paper describes a distributed supervised learning method for finding the best network architecture by dynamically modifying hyperparameters for a given task. On the MNIST dataset, it is shown that asynchronous supervised learning can converge on a solution in this search space. Setting several hyperparameters can be time-consuming when constructing neural networks. In this paper, we provide tips and guidelines for better organizing the hyperparameter tuning process, which should help find a good setting for the hyperparameters much faster.
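The abstract recommends organizing the hyperparameter search rather than trying settings ad hoc. As an illustrative sketch (not the paper's actual method), a common way to do this for the learning rate is a random search sampled log-uniformly across several orders of magnitude; the `toy_accuracy` evaluation function below is a hypothetical stand-in for training a model and measuring validation accuracy.

```python
import math
import random

def sample_log_uniform(lo, hi, rng):
    """Sample a value log-uniformly between lo and hi (lo, hi > 0)."""
    return 10 ** rng.uniform(math.log10(lo), math.log10(hi))

def random_search(evaluate, n_trials=20, lo=1e-5, hi=1e-1, seed=0):
    """Try n_trials learning rates and return the best (lr, score) pair."""
    rng = random.Random(seed)
    best_lr, best_score = None, float("-inf")
    for _ in range(n_trials):
        lr = sample_log_uniform(lo, hi, rng)
        score = evaluate(lr)  # in practice: train model, return val accuracy
        if score > best_score:
            best_lr, best_score = lr, score
    return best_lr, best_score

# Hypothetical stand-in for validation accuracy: peaks near lr = 1e-3.
def toy_accuracy(lr):
    return -abs(math.log10(lr) - math.log10(1e-3))

best_lr, best_score = random_search(toy_accuracy)
```

Sampling on a log scale matters because a learning rate of 1e-2 versus 1e-3 is a far bigger change than 0.05 versus 0.04; uniform sampling in the raw value would waste most trials at the high end of the range.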

Original language: English
Title of host publication: Proceedings - 2nd International Conference on Smart Electronics and Communication, ICOSEC 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1587-1593
Number of pages: 7
ISBN (electronic): 9781665433686
DOI
State: Published - 2021
Externally published
Event: 2nd International Conference on Smart Electronics and Communication, ICOSEC 2021 - Trichy, India
Duration: 7 Sep 2021 - 9 Sep 2021

Publication series

Name: Proceedings - 2nd International Conference on Smart Electronics and Communication, ICOSEC 2021

Conference

Conference: 2nd International Conference on Smart Electronics and Communication, ICOSEC 2021
Country/Territory: India
City: Trichy
Period: 7/09/21 - 9/09/21

Bibliographical note

Publisher Copyright:
© 2021 IEEE.

