| Issue | EPJ Web Conf., Volume 360, 2026: 1st International Conference on “Quantum Innovations for Computing and Knowledge Systems” (QUICK’26) |
|---|---|
| Article Number | 01017 |
| Number of page(s) | 14 |
| DOI | https://doi.org/10.1051/epjconf/202636001017 |
| Published online | 23 March 2026 |
A Quantum-Inspired Attention Network with Orthogonal Polynomials for Medical Image Classification
1 Department of CSE, Vel Tech Rangarajan Dr. Sagunthala R&D Institute of Science and Technology, Chennai, Tamil Nadu, India.
2,3 Department of CSE (AI&ML), Vel Tech Rangarajan Dr. Sagunthala R&D Institute of Science and Technology, Chennai, Tamil Nadu, India.
Abstract
In this research, a medical image classification framework called QuAntNet, based on an Orthogonal Polynomials Transformation (OPT) and a Quantum-Inspired Attention Network (QIAN), is presented. It integrates low-level image features automatically extracted from brain MRI images with expert-level clinical insights to identify precise diagnostic classes. The proposed method extracts fused edge, texture, and deep-level features from the image using OPT and Convolutional Neural Network (CNN) techniques. The extracted classical features are passed to a quantum-inspired attention mechanism that analyzes intricate relationships between the different discriminative image features. QuAntNet is validated on two reputed brain MRI datasets, BraTS2020 and Figshare, which include different classes of brain tumors. In experiments on the BraTS2020 dataset, the QuAntNet method achieves an overall accuracy of up to 98.7%, sensitivity of up to 98.2%, specificity of up to 99.0%, and an F1-score of 98.4%. When validated on Figshare, it produces an accuracy of 99.3%, sensitivity of 98.9%, specificity of 99.6%, and an F1-score of 99.1%. The experimental outcomes clearly show that incorporating fused edge, texture, and deep features together with the QIAN mechanism significantly enhances classification performance in the QuAntNet framework.
Key words: Brain tumor diagnosis / Low level features / Orthogonal polynomials transformation / Quantum feature fusion / Quantum-inspired attention networks / Quantum support vector machine
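To make the two key ingredients of the abstract concrete, the sketch below illustrates (not reproduces) them with NumPy: an orthogonal-polynomial projection of a 1-D signal onto Legendre polynomials as a stand-in for the paper's OPT feature extractor, and an attention step whose weights are formed by L2-normalizing raw scores and squaring them (a Born-rule-style device often used in quantum-inspired models) instead of a softmax. All function names, the choice of Legendre polynomials, the polynomial degree, and the toy data are assumptions for illustration; the paper's actual OPT and QIAN architectures are not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

def legendre_features(signal, degree=3):
    """Project a 1-D signal onto the first (degree + 1) Legendre polynomials.

    Hypothetical stand-in for an orthogonal-polynomials transform: the
    least-squares coefficients act as compact low-level descriptors.
    """
    x = np.linspace(-1.0, 1.0, signal.size)
    V = np.polynomial.legendre.legvander(x, degree)   # Vandermonde matrix, (N, degree+1)
    coeffs, *_ = np.linalg.lstsq(V, signal, rcond=None)
    return coeffs

def quantum_inspired_attention(queries, keys, values):
    """Attention with Born-rule-style weights.

    Raw dot-product scores are treated as state amplitudes: each score row
    is L2-normalized, then squared, giving non-negative weights that sum
    to 1 per query without a softmax.
    """
    scores = queries @ keys.T                                   # (n_q, n_k)
    amps = scores / (np.linalg.norm(scores, axis=1, keepdims=True) + 1e-12)
    weights = amps ** 2                                         # each row sums to ~1
    return weights @ values

# Toy run: four "patch" feature vectors attending over themselves.
feats = np.stack([legendre_features(rng.standard_normal(64)) for _ in range(4)])
out = quantum_inspired_attention(feats, feats, feats)
print(out.shape)
```

A design note on the squared-amplitude weighting: unlike softmax, it has no temperature parameter and preserves the relative magnitudes of scores quadratically, which is one common way "quantum-inspired" layers trade expressiveness for probabilistic interpretability.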
© The Authors, published by EDP Sciences, 2026
This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

