Deep manifold learning has achieved significant success in visual tasks by using Symmetric Positive Definite (SPD) matrices, particularly within multi-scale submanifold networks (MSNet). This network extracts a series of main-diagonal submatrices from SPD matrices. However, MSNet does not take into account the distribution of these submanifolds themselves. To address this limitation and bring batch normalization to submanifolds, we devise a submanifold-specific normalization approach that incorporates submanifold distribution information. Additionally, to account for the relative importance of different submanifolds once they are mapped into Euclidean space, we propose an attention mechanism tailored to log-mapped submanifolds, termed submanifold attention. Submanifold attention is decomposed into multiple 1D feature encodings, which captures dependencies between different submanifolds and promotes a more comprehensive understanding of the data structure. To demonstrate the effectiveness of this method, we conduct experiments on several visual databases. Our results indicate that the proposed approach outperforms MSNet.
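To make the pipeline described above concrete, the following is a minimal sketch (not the paper's implementation) of the three ingredients the abstract names: extracting main-diagonal submatrices from an SPD matrix, log-mapping each submanifold into Euclidean space, and re-weighting the log-mapped submanifolds with a simple attention score built from 1D summaries. All function names, the block-partitioning scheme, and the pooled-score form of the attention are illustrative assumptions.

```python
import numpy as np

def log_map(spd):
    """Map an SPD matrix to Euclidean (tangent) space via the matrix logarithm."""
    eigvals, eigvecs = np.linalg.eigh(spd)
    return eigvecs @ np.diag(np.log(eigvals)) @ eigvecs.T

def diagonal_submatrices(spd, block_size):
    """Extract non-overlapping main-diagonal blocks; any principal submatrix
    of an SPD matrix is itself SPD, so each block lives on an SPD submanifold."""
    n = spd.shape[0]
    return [spd[i:i + block_size, i:i + block_size]
            for i in range(0, n - block_size + 1, block_size)]

def submanifold_attention(log_blocks):
    """Toy attention over log-mapped submanifolds: each block is summarized by a
    1D score, the scores are passed through a softmax, and the resulting weights
    rescale the blocks (an assumption standing in for the paper's encoding)."""
    scores = np.array([np.mean(np.abs(b)) for b in log_blocks])  # 1D encoding per submanifold
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return [w * b for w, b in zip(weights, log_blocks)]

# Usage: build a random SPD matrix, split it into 4x4 diagonal blocks,
# log-map each block, then re-weight the blocks with the toy attention.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 8))
spd = A @ A.T + 8 * np.eye(8)            # guarantees positive definiteness
blocks = diagonal_submatrices(spd, 4)
log_blocks = [log_map(b) for b in blocks]
weighted = submanifold_attention(log_blocks)
```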