Robust Classification of Black-Eyed Peas Based on Segment Anything Model and Transfer Learning

Sachin Sonawane, Suresh Kurumbanshi

Abstract

Evaluating the physical quality of harvested black-eyed peas is essential to ensure that products derived from them meet high standards, and carefully designed and optimized machine learning models can improve this quality evaluation. This work introduces a hybrid neural network integrating EfficientNetV2B1 and a Vision Transformer (ViT) to classify black-eyed peas. One of the main challenges in accurate classification was segmenting objects in a clustered view: inconsistent lighting, variation in sample size, random placement of the objects, and neighboring objects touching each other make the task difficult. We addressed this issue with the Segment Anything Model (SAM), which detected individual objects with 100% accuracy for samples weighing up to 30 grams. We combined SAM with a custom object-retrieval block that separates the segmented objects into images of 224 × 224 × 3 pixels for classification. We also balanced the dataset through image augmentation based on Stable Diffusion, which generates high-quality, diverse images while preserving the original distribution. Subsequently, we experimented with five hybrid architectures (EfficientNetV2B1+ViT, MobileNetV2+ShuffleNetV2, ResNet50+DenseNet121, VGG16+ResNet18, and InceptionV3+MobileNetV2), with feature fusion performed by a convolutional block attention module (CBAM). Our experiments showed that the EfficientNetV2B1+ViT model outperformed the others, exploiting depth-wise separable convolutions together with the transformer's multi-head self-attention mechanism. With hyperparameter optimization, EfficientNetV2B1+ViT achieved an accuracy of 95.80% and a loss of 0.1256 across eight classes of sound-quality seeds, defects, and foreign contamination, highlighting its efficiency and robustness.
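The object-retrieval step described above could be sketched as follows. This is a minimal, dependency-light illustration, not the authors' implementation: it assumes SAM has already produced a binary mask per object, crops each object's bounding box, blanks out the background, and centers the crop on a 224 × 224 × 3 canvas (the function name `retrieve_object` and the nearest-neighbour downscaling are our own choices for the sketch).

```python
import numpy as np

def retrieve_object(image: np.ndarray, mask: np.ndarray, out_size: int = 224) -> np.ndarray:
    """Crop one segmented object (hypothetical sketch of the paper's
    object-retrieval block) and center it on an out_size x out_size x 3 canvas."""
    ys, xs = np.nonzero(mask)                          # pixels belonging to this object
    y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
    # Keep only the object's pixels; zero out the background inside the box.
    crop = np.where(mask[y0:y1, x0:x1, None], image[y0:y1, x0:x1], 0)
    h, w = crop.shape[:2]
    scale = min(out_size / h, out_size / w)            # shrink only if the crop is too large
    if scale < 1.0:
        # Nearest-neighbour resize keeps the sketch free of image libraries.
        new_h, new_w = max(1, int(h * scale)), max(1, int(w * scale))
        rows = (np.arange(new_h) / scale).astype(int)
        cols = (np.arange(new_w) / scale).astype(int)
        crop = crop[rows][:, cols]
        h, w = new_h, new_w
    canvas = np.zeros((out_size, out_size, 3), dtype=image.dtype)
    top, left = (out_size - h) // 2, (out_size - w) // 2
    canvas[top:top + h, left:left + w] = crop
    return canvas
```

In practice the masks would come from SAM's automatic mask generator, and a real pipeline would likely use an anti-aliased resize; the sketch only shows the crop-and-canvas logic.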
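The CBAM-based feature fusion mentioned in the abstract applies channel attention followed by spatial attention to a feature map. The sketch below is a plain-numpy illustration of the standard CBAM computation under our own simplifying assumptions (explicit weight arguments instead of learned layers, a naive 7 × 7 convolution loop); it is not the paper's code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cbam(x, w1, w2, w_conv):
    """Minimal CBAM sketch: channel attention, then spatial attention.
    x: (H, W, C) feature map; w1: (C, C//r) and w2: (C//r, C) are the shared
    MLP weights; w_conv: (7, 7, 2) kernel for the spatial-attention conv.
    All weights are passed in explicitly (assumed, for illustration)."""
    H, W, C = x.shape
    # Channel attention: shared MLP over average- and max-pooled descriptors.
    avg_c = x.mean(axis=(0, 1))                      # (C,)
    max_c = x.max(axis=(0, 1))                       # (C,)
    mlp = lambda v: np.maximum(v @ w1, 0.0) @ w2     # ReLU hidden layer
    ch_att = sigmoid(mlp(avg_c) + mlp(max_c))        # (C,) in (0, 1)
    x = x * ch_att                                   # broadcast over H, W
    # Spatial attention: 7x7 conv over channel-wise avg and max maps.
    stacked = np.stack([x.mean(axis=2), x.max(axis=2)], axis=2)  # (H, W, 2)
    pad = np.pad(stacked, ((3, 3), (3, 3), (0, 0)))
    sp = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            sp[i, j] = np.sum(pad[i:i + 7, j:j + 7] * w_conv)
    return x * sigmoid(sp)[..., None]                # (H, W, C), rescaled
```

In a hybrid model, the concatenated backbone features (e.g. EfficientNetV2B1 and ViT outputs) would be reshaped into such a map and refined by CBAM before the classification head; since both attention maps lie in (0, 1), the module only rescales features and preserves their shape.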
