
Please use this identifier to cite or link to this item:
http://localhost:8080/xmlui/handle/123456789/1956
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chanana, Dr. Garima | - |
dc.date.accessioned | 2025-05-19T09:27:42Z | - |
dc.date.available | 2025-05-19T09:27:42Z | - |
dc.date.issued | 2025 | - |
dc.identifier.doi | https://doi.org/10.1016/j.chemphys.2024.112591 | - |
dc.identifier.uri | http://localhost:8080/xmlui/handle/123456789/1956 | - |
dc.description.abstract | Deep learning has significantly advanced molecular property prediction, with Message-Passing Graph Neural Networks (MPGNN) standing out as an effective method. This study systematically evaluates the performance of ten activation functions — Sigmoid, Tanh, ReLU, Leaky ReLU, ELU, SELU, Softmax, Swish, Mish, and GELU — using the MPGNN model on the QM9 dataset. It aims to identify the most suitable activation functions for training neural networks for specific molecular properties. The study examines electronic properties such as HOMO, LUMO, the HOMO-LUMO energy gap, dipole moment, and polarizability, as well as thermal properties such as zero-point vibrational energy (ZPVE) and specific heat capacity. The findings reveal that different activation functions excel for different properties: SELU for HOMO, ELU for LUMO, Sigmoid for the HOMO-LUMO gap, Mish for polarizability, GELU for ZPVE, and Leaky ReLU for dipole moment and specific heat capacity. These insights are valuable for optimizing MPGNN design for targeted molecular property prediction. | en_US |
dc.language.iso | en | en_US |
dc.publisher | Chemical Physics | en_US |
dc.title | Performance analysis of activation functions in molecular property prediction using Message Passing Graph Neural Networks. | en_US |
dc.type | Article | en_US |
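The abstract compares ten activation functions by name. As a minimal sketch of what is being compared, their standard textbook definitions can be written in plain Python (the paper's exact implementations and hyperparameters are not given in this record; the default slopes and the tanh approximation of GELU below are assumptions):

```python
import math

# Standard definitions of the ten activation functions named in the abstract.
# All are scalar-valued except softmax, which normalizes a whole vector.

def sigmoid(x): return 1.0 / (1.0 + math.exp(-x))
def tanh(x): return math.tanh(x)
def relu(x): return max(0.0, x)
def leaky_relu(x, alpha=0.01):           # alpha is the usual default slope
    return x if x > 0 else alpha * x
def elu(x, alpha=1.0):
    return x if x > 0 else alpha * (math.exp(x) - 1.0)
def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # alpha/scale are the self-normalizing constants from the SELU paper
    return scale * (x if x > 0 else alpha * (math.exp(x) - 1.0))
def swish(x): return x * sigmoid(x)       # also known as SiLU
def mish(x):                              # x * tanh(softplus(x))
    return x * math.tanh(math.log1p(math.exp(x)))
def gelu(x):                              # common tanh approximation of GELU
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))
def softmax(xs):
    # Subtract the max for numerical stability; output sums to 1.
    exps = [math.exp(v - max(xs)) for v in xs]
    total = sum(exps)
    return [v / total for v in exps]
```

In an MPGNN these nonlinearities are typically applied after the message and update transformations at each message-passing step, which is where the choice of function influences what the network can learn for a given target property.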
Appears in Collections: | VSE&T |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
garima.docx |  | 190.73 kB | Microsoft Word XML | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.