Multi-gate Weighted Fusion Network for neuronal morphology classification

Front Neurosci. 2024 Nov 8;18:1322623. doi: 10.3389/fnins.2024.1322623. eCollection 2024.

Abstract

Analyzing the types of neurons based on morphological characteristics is pivotal for understanding brain function and human development. Existing analysis approaches based on 2D view images make full use of the complementary information across images. However, during fusion these methods ignore both the redundant information introduced by similar images and the unequal influence of different views on the analysis results. Considering these factors, this paper proposes a Multi-gate Weighted Fusion Network (MWFNet) to characterize neuronal morphology in a hierarchical manner. MWFNet mainly consists of a Gated View Enhancement Module (GVEM) and a Gated View Measurement Module (GVMM). GVEM enhances view-level descriptors and eliminates redundant information by mining the relationships among different views. GVMM calculates the weight of each view image based on its salient activated regions to assess its influence on the analysis results. The enhanced view-level features are then fused differentially according to these view weights to generate a more discriminative instance-level descriptor. In this way, the proposed MWFNet not only eliminates unnecessary features but also maps the representational differences among views into the decision process, improving its accuracy and robustness in neuron type identification. Experimental results show that our method achieves accuracies of 91.73% and 98.18% on classifying 10 and 5 types of neurons, respectively, outperforming other state-of-the-art methods.
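To illustrate the weighted-fusion step described above (per-view descriptors weighted and combined into an instance-level descriptor before classification), the following PyTorch sketch is provided. It is a minimal, assumption-laden illustration only: the class name, feature dimensions, and the simple gating network are hypothetical stand-ins and do not reproduce the paper's GVEM/GVMM designs, whose internals are not specified in this abstract.

```python
# Minimal sketch (not the authors' code): weighted fusion of multi-view
# features into a single instance-level descriptor for classification.
# All names and dimensions are illustrative assumptions.
import torch
import torch.nn as nn


class WeightedViewFusion(nn.Module):
    """Fuses per-view descriptors using learned per-view weights."""

    def __init__(self, feat_dim: int = 512, num_classes: int = 10):
        super().__init__()
        # Gate that scores each view descriptor (a crude stand-in for
        # saliency-based view weighting as described in the abstract).
        self.view_gate = nn.Sequential(
            nn.Linear(feat_dim, feat_dim // 4),
            nn.ReLU(inplace=True),
            nn.Linear(feat_dim // 4, 1),
        )
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, view_feats: torch.Tensor) -> torch.Tensor:
        # view_feats: (batch, num_views, feat_dim) per-view descriptors,
        # e.g. CNN features of 2D projections of a neuron.
        scores = self.view_gate(view_feats)            # (B, V, 1)
        weights = torch.softmax(scores, dim=1)         # normalize across views
        instance_feat = (weights * view_feats).sum(1)  # (B, feat_dim)
        return self.classifier(instance_feat)


# Usage example: 8 views per neuron, 512-dim features, 10 neuron types.
if __name__ == "__main__":
    model = WeightedViewFusion(feat_dim=512, num_classes=10)
    dummy_views = torch.randn(4, 8, 512)
    logits = model(dummy_views)
    print(logits.shape)  # torch.Size([4, 10])
```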

Keywords: hierarchical descriptors; morphological representation; multiple views; neuronal morphology analysis; weighted fusion.

Grants and funding

The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. This work was supported by the Anhui Provincial Natural Science Foundation under Grant 2108085UD12.