Deep dictionary learning with reconstruction for texture recognition

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Texture recognition underpins critical applications in industrial quality control, robotic manipulation, and biomedical imaging. Traditional deep dictionary learning methods for texture recognition often emphasize deep feature extraction; however, they tend to lose crucial features as model depth increases, which can reduce their overall effectiveness. To address this issue, we propose a dictionary-reconstruction-based deep learning approach that incorporates a novel hybrid fusion method designed to enhance the accuracy of texture recognition. Our approach successively fuses multimodal and multi-level features. By reconstructing dictionaries learned at different levels, we integrate both deep and intuitive features. Additionally, we introduce a grouping optimization technique, based on single-sample learning, to train these reconstructed dictionaries, thereby improving feature learning and training efficiency. The proposed approach fuses feature data from various multimodal sources and constructs dictionaries at different learning levels, which enables effective feature fusion across these levels. We evaluate our approach against recent deep learning methods on the LMT-108 and SpectroVision datasets. The results show accuracy rates of 97.7% and 89.4%, respectively, outperforming the compared methods and validating the approach's robustness on diverse and challenging data.
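As background for the dictionary learning on which the paper builds, the sketch below shows a generic single-level dictionary learner that alternates sparse coding (a few ISTA steps on an l1-penalized reconstruction objective) with a least-squares dictionary update. This is a minimal illustration of standard dictionary learning, not the authors' reconstruction or fusion method; the function name, matrix shapes, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

def learn_dictionary(X, n_atoms=16, n_iter=30, lam=0.1, seed=0):
    """Generic dictionary learning sketch (NOT the paper's method).

    X : (n_features, n_samples) data matrix.
    Returns a column-normalized dictionary D and sparse codes A
    such that X is approximated by D @ A.
    """
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((X.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0, keepdims=True)
    A = np.zeros((n_atoms, X.shape[1]))
    for _ in range(n_iter):
        # Sparse coding: a few ISTA steps on ||X - D A||^2 + lam * ||A||_1.
        L = np.linalg.norm(D, 2) ** 2  # step size from the Lipschitz constant
        for _ in range(10):
            A = A - (D.T @ (D @ A - X)) / L          # gradient step
            A = np.sign(A) * np.maximum(np.abs(A) - lam / L, 0.0)  # shrinkage
        # Dictionary update: least squares for fixed codes, then renormalize
        # atoms to unit norm and rescale codes so D @ A is unchanged.
        D = X @ np.linalg.pinv(A)
        norms = np.linalg.norm(D, axis=0, keepdims=True) + 1e-12
        D /= norms
        A *= norms.T

    return D, A

# Toy usage: 20-dimensional features, 100 samples.
X = np.random.default_rng(1).standard_normal((20, 100))
D, A = learn_dictionary(X)
rel_err = np.linalg.norm(X - D @ A) / np.linalg.norm(X)
```

A deep variant, as the abstract describes, would stack such dictionaries so that the codes `A` from one level become the input to the next, with the paper's contribution being the reconstruction-based fusion of those per-level dictionaries.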

Original language: English (US)
Article number: 31164
Journal: Scientific Reports
Volume: 15
Issue number: 1
DOIs
State: Published - Dec 2025

All Science Journal Classification (ASJC) codes

  • General

Keywords

  • Deep dictionary learning
  • Dictionary reconstruction
  • Feature fusion
  • Texture recognition
