Abstract
In recent years, deep neural networks have made substantial progress in object recognition. However, one open issue in deep learning is that it is often unclear which proposed architecture is best suited to a specific problem. As a result, distinct configurations are tried until one that produces satisfactory results is found. This paper describes a distributed supervised learning method for finding the best network architecture by dynamically modifying hyperparameters for a given task. On the MNIST dataset, it is shown that asynchronous supervised learning can converge on a solution space. Setting several hyperparameters can be time-consuming when constructing neural networks; the paper also provides tips and guidelines for organizing the hyperparameter tuning process, which should help practitioners find a good hyperparameter setting much faster.
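The abstract describes searching over hyperparameter configurations until a satisfactory one is found. A common way to organize such a search is random sampling, with learning rates drawn on a log scale. The sketch below is illustrative only and not the paper's method: `mock_validation_accuracy` is a hypothetical stand-in for actually training a network on MNIST, and all names are assumptions.

```python
import math
import random

def sample_config(rng):
    """Draw one hyperparameter configuration at random.

    The learning rate is sampled on a log scale so that values
    near 1e-4 and near 1e-1 are equally likely to be tried.
    """
    return {
        "learning_rate": 10 ** rng.uniform(-4, -1),
        "hidden_units": rng.choice([64, 128, 256, 512]),
        "batch_size": rng.choice([32, 64, 128]),
    }

def mock_validation_accuracy(cfg):
    """Hypothetical stand-in for training and validation.

    In practice this would train a network on MNIST and return
    validation accuracy; here it simply scores configurations
    by how close the learning rate is to a mid-range value.
    """
    return 1.0 - abs(math.log10(cfg["learning_rate"]) + 2.5) / 10.0

def random_search(n_trials, seed=0):
    """Try n_trials random configurations; keep the best one."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = sample_config(rng)
        score = mock_validation_accuracy(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

best_cfg, best_score = random_search(20)
print(best_cfg, round(best_score, 3))
```

Random sampling is preferred over a grid when some hyperparameters matter more than others, since each trial explores a new value along every dimension.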
| Original language | English |
|---|---|
| Title of host publication | Proceedings - 2nd International Conference on Smart Electronics and Communication, ICOSEC 2021 |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 1587-1593 |
| Number of pages | 7 |
| ISBN (electronic) | 9781665433686 |
| DOI | |
| State | Published - 2021 |
| Externally published | Yes |
| Event | 2nd International Conference on Smart Electronics and Communication, ICOSEC 2021 - Trichy, India. Duration: 7 Sep 2021 → 9 Sep 2021 |
Publication series

| Name | Proceedings - 2nd International Conference on Smart Electronics and Communication, ICOSEC 2021 |
|---|---|
Conference

| Conference | 2nd International Conference on Smart Electronics and Communication, ICOSEC 2021 |
|---|---|
| Country/Territory | India |
| City | Trichy |
| Period | 7/09/21 → 9/09/21 |
Bibliographical note

Publisher Copyright: © 2021 IEEE.