Ciência da Computação
Permanent URI for this community: https://repositorio.cesupa.br/handle/prefix/19
Browsing Ciência da Computação by author "Barros, Lucas Lima de Aragão"
Item: Otimização de hiperparâmetros de redes neurais por inferência Bayesiana com enxame de partículas (Centro Universitário do Estado do Pará, 2023-12-07)
Author: Barros, Lucas Lima de Aragão
Contributors: Souza, Daniel Leal; Mollinetti, Marco Antônio Florenzano; Nascimento, Polyana Santos Fonseca; Oliveira, Roberto Célio Limão de
Lattes CVs: http://lattes.cnpq.br/1246642827046945; http://lattes.cnpq.br/6059334260016388; http://lattes.cnpq.br/6889523334917369; http://lattes.cnpq.br/4497607460894318

Abstract: This work explores, through hyperparameter optimization techniques, the possibility of improving the effectiveness of neural networks, which have become widely popular across many fields. The focus is to evaluate a new approach to hyperparameter optimization, one of the main challenges in developing these networks. Traditionally, hyperparameters are set stochastically or through mathematical calculations; the goal here is to refine that process so that values can be chosen more precisely and with lower loss. The proposed solutions are grounded in a literature review of hyperparameter optimization in neural networks (e.g., convolutional and fully connected networks), covering fundamental concepts and techniques such as Grid Search, Random Search, and Sequential Model-Based Optimization (SMBO). The methodology covers the choice of tools, the cross-validation strategy, and specific search approaches, such as Gaussian-process regression and grid search. The experiments involve optimizing neural-network hyperparameters and present the datasets, model configurations, training protocols, and quantitative results. Despite its limitations and the need for further study, this work demonstrates the feasibility and potential of combining advanced hyperparameter optimization techniques for neural networks, offering contributions and stimulating new research directions toward greater efficacy in the many applications of these networks.
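The core idea referenced in the title, a particle-swarm search over hyperparameter space guided by a validation objective, can be sketched as follows. This is a minimal illustration under stated assumptions, not the thesis's actual algorithm: the function val_loss is a hypothetical smooth stand-in for a network's validation loss (minimized at learning rate 0.01 and 64 hidden units), and the search bounds and PSO coefficients are arbitrary example choices.

```python
import math
import random

# Hypothetical smooth surrogate for a network's validation loss;
# in practice this would train and cross-validate a real model.
def val_loss(lr, hidden):
    return (math.log10(lr) + 2) ** 2 + ((hidden - 64) / 64) ** 2

# Search bounds: (learning rate, hidden units) -- example values only.
BOUNDS = [(1e-4, 1e-1), (8, 256)]

def pso(n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    # Initialize particle positions uniformly within bounds, velocities at zero.
    pos = [[rng.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # per-particle best position
    pbest_f = [val_loss(*p) for p in pos]           # per-particle best loss
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]        # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(2):
                r1, r2 = rng.random(), rng.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move the particle and clamp it back into the bounds.
                lo, hi = BOUNDS[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            f = val_loss(*pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

best, best_f = pso()
print(f"best lr={best[0]:.4f}, hidden={best[1]:.1f}, loss={best_f:.4f}")
```

The thesis couples a swarm-based search like this with Bayesian inference; here only the plain PSO skeleton is shown, since the exact coupling is specific to the work.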
The discussion analyzes the effects of the hyperparameter settings, identifies limitations, and considers possible reasons for the results obtained. Finally, the conclusion summarizes the main insights, specific contributions, and future directions, while the references ground the work.