Efficient Commercial Classification of Agricultural Products using Convolutional Neural Networks

Ali Jebelli, Rafiq Ahmad


Agricultural products, as essential commodities, are among the most sought-after items in supermarkets. Barcodes are commonly used to classify and price products such as ornamental flowers in these stores. However, attaching barcodes to fragile agricultural products such as ornamental flowers can damage them and shorten their shelf life. The process is also time-consuming and costly, and it can generate substantial waste, harm the environment, and introduce chemical materials into food products that may affect human health. We therefore designed a classifier robot that recognizes ornamental flowers from product images captured at different times and under varying surrounding conditions. The system increases the speed and accuracy of distinguishing and classifying products, reduces pricing time, and extends product lifetime because the products never need to be moved or repositioned. Using the dataset stored in the robot's database, the system can identify and present a product in different colors and shapes. Moreover, because the database is small and standardized to match the robot's needs, the robot can be trained in a short time (less than five minutes) without an Internet connection or a large hard drive for storing the data. Finally, by dividing each input photo into ten sections, the system can detect decorative flowers very quickly, with a high accuracy of 97%, in several images simultaneously, under different conditions, angles, and environments, and even alongside other objects such as vases, without requiring a separate detection system.
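The section-splitting idea described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a 2×5 grid layout for the ten sections (the abstract does not specify the arrangement), and the per-section classifier is left as a placeholder that a trained CNN would fill.

```python
import numpy as np

def split_into_sections(image, rows=2, cols=5):
    """Divide an image into rows*cols equal regions (ten by default).

    Each region can then be passed independently to a CNN classifier,
    so flowers anywhere in the frame are covered without a separate
    object-detection stage. The 2x5 grid is an assumption for
    illustration only.
    """
    h, w = image.shape[:2]
    sh, sw = h // rows, w // cols
    sections = []
    for r in range(rows):
        for c in range(cols):
            # Slice out one grid cell; trailing pixels from uneven
            # division are simply ignored in this sketch.
            sections.append(image[r * sh:(r + 1) * sh, c * sw:(c + 1) * sw])
    return sections

# Example: a 200x500 RGB image yields ten 100x100 sections.
img = np.zeros((200, 500, 3), dtype=np.uint8)
parts = split_into_sections(img)
print(len(parts), parts[0].shape)  # 10 (100, 100, 3)
```

In a full pipeline, each returned section would be resized to the CNN's input size and classified; sections whose top prediction falls below a confidence threshold would be treated as background.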


Convolutional Neural Network, Visual Geometry Group, Classification



DOI: http://doi.org/10.11591/ijra.v10i4.pp%25p



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
