Real-Time Plant Disease Detection ML
A Project Report
Submitted
In Partial Fulfillment of the Requirements
For the Degree of Bachelor of Technology in Computer Science & Engineering
We hereby declare that the project work presented in this report entitled “Real Time Disease
Identification in Plants Using Machine Learning”, in partial fulfillment of the
requirements for the award of the degree of Bachelor of Technology in Computer Science
& Engineering, submitted to Dr. A.P.J. Abdul Kalam Technical University, Uttar Pradesh,
Lucknow, is based on our own work carried out at the Department of Computer Science &
Engineering, G.L. Bajaj Institute of Technology & Management, Greater Noida. The work
contained in this report is true and original to the best of our knowledge, and the project work
reported herein has not been submitted by us for the award of any other degree or diploma.
Signature:
Name: Aditi Dutt
Signature:
Name: Anshika Goyal
Signature:
Name: Aashi Chaudhary
Signature:
Name: Lakshya Bhati
Date:
Place: Greater Noida
Certificate
This is to certify that the Project report entitled “Real Time Disease Identification in
Plants Using Machine Learning” done by Aditi Dutt (2101920100017), Anshika Goyal,
Aashi Chaudhary, and Lakshya Bhati of the Department of Computer
Science & Engineering, G.L. Bajaj Institute of Technology & Management, Greater Noida
under my guidance. The matter embodied in this project work has not been submitted earlier
for the award of any degree or diploma to the best of my knowledge and belief.
Date:
Acknowledgement
The merciful guidance bestowed on us by the Almighty made us see this project through to a
successful end. We humbly pray with sincere hearts for His guidance to continue forever.
We thank our project guide, Ms. Abha Kaushik, who has given guidance and light to
us during this project. Her versatile knowledge has helped us in critical times during the span
of this project.
We pay special thanks to our Head of Department, Dr. Sansar Singh Chauhan, who has
always been present as a support and helped us in every possible way during this project.
We also take this opportunity to express our gratitude to all those people who have been directly
or indirectly with us during the completion of the project.
We want to thank our friends who have always encouraged us during this project.
Last but not least, thanks to all the faculty of the CSE department who provided valuable
suggestions during the period of the project.
Abstract
Plant diseases are a major threat to global agriculture, leading to reduced crop yields and
economic instability, especially in regions where farming is the primary livelihood. Timely and
accurate disease detection is essential for minimizing these losses. This project proposes a real-
time plant disease identification system using MobileNetV2, a lightweight convolutional neural
network designed for efficient deployment on web and mobile platforms. A total of 21 classes
were created from images of mango, potato, and eggplant leaves, representing both healthy and
diseased conditions, including biotic (caused by pathogens) and abiotic (caused by
environmental stress) disorders. The dataset was preprocessed with image resizing,
normalization, and augmentation to enhance variability and model robustness. Using
MobileNetV2 with transfer learning, the model was trained to classify these conditions and
achieved an accuracy of 74.39%. While not state-of-the-art, this performance demonstrates a
meaningful step toward real-time field deployment on low-resource devices. The system is
deployed as an interactive Streamlit web application that allows users to upload images of plant
leaves, receive instant disease predictions, and access relevant treatment suggestions. By
integrating deep learning with an accessible web interface, this project offers a scalable solution
to assist farmers in early detection of plant diseases, enabling more efficient crop management
and contributing to the advancement of smart agriculture.
TABLE OF CONTENTS
Chapter 1. Introduction…………………………………………………………… 9 - 15
1.1 Preliminaries.......................................................................................... 9
1.2 Problem Statement ……………………………………………………. 13
1.3 Motivation …………………………………......................................... 13
1.4 Objectives …………………………………………………………….. 14
Chapter 2 Literature Survey................................................................................ 16 – 34
2.1 Introduction ………………………………………………………….. 16
2.2 Research Gap ......................................................................................... 26
2.3 Dataset Collection ………….................................................................. 27
Chapter 3. Proposed Methodology………………………………………………. 35 – 40
3.1 Introduction …………………………………………………………… 35
3.2 Problem Formulation …………………………………………………. 35
3.3 Proposed Work ……………………………………………………… 36
Chapter 4. Implementation…................................................................................. 41 – 44
4.1 Introduction ……………………………………………………………. 41
4.2 Implementation Strategy (Flowchart, Algorithm etc.) …………………. 41
4.3 Tools/Hardware/Software Requirements..……………………………… 43
4.4 Expected Outcome …………………………………….………………. 44
Chapter 5. Result & Discussion ……………………….......................................... 45 – 49
5.1 Result Overview ………………………………………………………… 45
5.2 Model Performance Evaluation ……………….……………………… 45
5.3 Visual Results and Web Application output …………………………… 47
5.4 Discussion ………………………………………………………… 48
Chapter 6. Conclusion & Future Scope.………………………............................ 50 - 51
6.1 Conclusion ……………………………………………………………. 50
6.2 Future Scope ……………………………………………………………. 50
References …………………………………………………………………………… 52 – 55
LIST OF FIGURES
LIST OF TABLES
Chapter 1
Introduction
1.1 Preliminaries
Agriculture plays a vital role in supporting economies around the world, essential not only for
ensuring food security but also for supplying raw materials to various industries[1]. It supports
the livelihood of billions and provides critical resources such as food crops, fibers, and other
plant-based products. However, a persistent challenge faced by the agricultural sector is the
prevalence of plant diseases. Plant diseases can significantly impact crop production, leading
to reduced food supplies and economic hardship, especially in areas where farming is the main
source of livelihood [2].
Traditionally, identifying plant diseases has mostly been a manual process, relying on farmers
and experts to inspect plants by hand. This involves spotting symptoms on leaves and stems,
which can be slow, subjective, and not feasible on larger farms. Moreover, because many plant
diseases share similar signs, there is a higher chance of misdiagnosis. This can lead to
unnecessary pesticide use or delays in taking action, both of which negatively impact the
environment and lower crop yields.
With recent advancements in Machine Learning (ML) and Deep Learning (DL), particularly
in image recognition, plant disease detection has become significantly more efficient. Models
such as MobileNetV2, a lightweight and efficient convolutional neural network, have proven
effective in identifying diseases from leaf images with high accuracy[3]. MobileNetV2 is
especially suitable for deployment in mobile and web-based applications due to its balance
between performance and computational efficiency[4]. These models can detect early signs of
disease in mango, potato, and eggplant leaves, often before they are noticeable to the human
eye, enabling faster and more reliable decision-making. While the integration of ML in
agriculture is still emerging, its impact is growing rapidly. Automated disease detection
empowers farmers with timely insights, reducing the spread of infections and minimizing crop
loss. This shift from manual to AI-assisted diagnosis marks a crucial step toward sustainable
and technology-driven farming.
To make these innovations accessible, web-based platforms are being developed where users
can upload leaf images for instant analysis. Such systems democratize access to advanced
agricultural tools, especially for small-scale farmers or those in remote areas. By offering
intuitive interfaces and reliable disease predictions, these applications enable better crop
protection and foster data-driven agricultural practices. There are two types of diseases in
plants, as shown in Figure 1.1:
1.1.1 Biotic diseases: These are caused by living organisms such as fungi, bacteria, viruses,
and nematodes. Biotic stress typically manifests as visible symptoms like leaf spots,
blights, rots, and wilting. Accurate classification of biotic diseases requires datasets
containing clear, labelled examples of each pathogen-induced symptom as shown in
Table 1.1.
Table 1.1 - Biotic Plant Diseases

S.No | Disease | Cause | Symptoms
1 | Bacterial Blight | Bacteria | Water-soaked lesions on leaves, dark streaks, wilting
2 | Powdery Mildew | Fungi | White powdery coating on leaves, stems, and flowers
3 | Root Rot | Fungi | Yellowing leaves, stunted growth, mushy roots
4 | Rust | Fungi | Small rust-coloured spots on leaves and stems
5 | Downy Mildew | Fungi | Yellow patches on leaves, white growth on undersides
6 | Leaf Spot | Bacteria/Fungi | Dark spots on leaves with yellow halos, leaf drop
7 | Wilt Disease | Bacteria/Fungi | Sudden wilting, yellowing leaves, stunted growth
8 | Viral Mosaic | Virus | Mottled yellow or green leaves, distorted growth, reduced yield
1.1.2 Abiotic diseases: These are caused by non-living factors such as nutrient deficiencies,
chemical exposure, drought, temperature extremes, or pollution. Unlike biotic diseases,
abiotic stress symptoms often resemble those caused by pathogens, which can complicate
classification. These include yellowing, browning, scorching, or irregular leaf shapes as
shown in Table 1.2.
Table 1.2 - Abiotic Plant Diseases

S.No | Disease | Cause | Symptoms
3 | Frost Damage | Low temperatures | Blackened or scorched leaves, wilting, stunted growth
4 | Sunscald | Excess sunlight | White or brown patches on leaves, usually on sun-exposed areas
7 | Water-logging | Excess water | Root suffocation, yellowing leaves, wilting
1.2 Problem Statement
The agricultural sector continues to face critical challenges due to the widespread occurrence
of plant diseases, which severely impact crop yield, food quality, and farmer income. Timely
and precise identification of plant diseases is crucial for effective treatment and crop
management. Traditional approaches, such as manual visual inspections, are slow, error-prone,
and impractical for large-scale monitoring. These methods often fail to handle the
volume of data required for comprehensive plant health management. This leads to delayed
interventions, misuse of agrochemicals, and significant economic losses.
In countries where agriculture is the main way of making a living and access to agricultural
expertise is limited, there is an urgent need for scalable, accurate, and user-friendly solutions.
Leaf-based visual symptoms are a common and accessible way to diagnose plant diseases, yet
interpreting them correctly requires specialized knowledge that many farmers do not have.
The core problem this project aims to address is the lack of accessible and efficient plant
disease diagnosis tools for key crops such as mango, potato, and eggplant. By leveraging the
lightweight MobileNetV2 deep learning architecture, this project seeks to develop a web-based
application that can automatically identify diseases from leaf images. This system is designed
to provide farmers and agricultural workers with real-time, accurate disease predictions and
solutions, ultimately reducing crop loss and improving sustainable farming practices.
1.3 Motivation
Early spotting of plant diseases is crucial to stop them from spreading and causing serious
harm, but traditional methods relying on manual inspection are often slow, subjective, and
ineffective at scale. Visual similarities between different diseases can make diagnosis difficult
even for experienced farmers. Moreover, limited access to agricultural experts and diagnostic
tools in rural areas further delays timely interventions, leading to lower yields and increased
economic losses.
This project aims to tackle these issues by harnessing modern, user-friendly technology.
Thanks to advancements in Artificial Intelligence (AI) and Machine Learning (ML),
particularly in image classification, we can now create tools that automatically detect plant
diseases through mobile apps or web platforms. Among various models, MobileNetV2 stands
out for being lightweight and optimized for performance on devices with limited computing
power, making it ideal for real-world agricultural applications.
The project aims to harness the capabilities of MobileNetV2 to identify diseases in mango,
potato, and eggplant leaves in real time. By allowing users to simply upload an image of a
diseased leaf, the system can detect the disease and provide relevant treatment suggestions
instantly. This enables timely decision-making and significantly reduces the dependency on
manual diagnosis or expert consultation.
Furthermore, the motivation extends beyond technical advancement: it is about making this
solution accessible and inclusive. By building a user-friendly web interface with multilingual
voice and text support, the system can serve farmers in remote or linguistically diverse areas,
empowering them with knowledge and tools they might not otherwise access. Ultimately, this
project seeks to bridge the gap between cutting-edge AI and grassroots farming needs,
improving both productivity and sustainability.
1.4 Objectives
This project is motivated by the goal of creating a real-time plant disease detection system that
is practical, scalable, and easy to use in everyday agricultural settings. It focuses on achieving
several important objectives:
1.4.1 To study existing methodologies and analyse the characteristics of datasets available
on different platforms for various plant diseases:
This objective aims to study existing plant disease image datasets to understand their
structure, class distribution, quality, and limitations. The analysis will help identify the
specific requirements for training an efficient MobileNetV2 model for real-time plant
disease detection, focusing on aspects such as image resolution, disease variability, and
environmental conditions.
1.4.2 To create and preprocess a high-quality, balanced dataset of real-time plant disease:
This objective involves collecting, labeling, and preprocessing images showing both
healthy and diseased leaves from mango, potato, and eggplant plants. The dataset will be
tailored to meet the input requirements of the MobileNetV2 model, ensuring it is
lightweight, diverse, and suitable for deployment in real-world agricultural settings.
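As an illustration of the preprocessing step mentioned above, images must be brought to MobileNetV2's expected input size (224x224) and pixel values rescaled from [0, 255] to [-1, 1], which is the convention used by the Keras MobileNetV2 preprocessing. The pure-Python nearest-neighbour resize below is a sketch for a single channel only; a real pipeline would use a library such as TensorFlow or Pillow.

```python
# Sketch of per-image preprocessing: resize to 224x224 and rescale pixels
# to [-1, 1]. Single-channel, nearest-neighbour, for illustration only.

TARGET = 224

def scale_pixel(value: int) -> float:
    """Map a 0-255 channel value to the [-1, 1] range."""
    return value / 127.5 - 1.0

def preprocess(image: list, size: int = TARGET) -> list:
    """Nearest-neighbour resize of a single-channel image, then rescale."""
    src_h, src_w = len(image), len(image[0])
    out = []
    for y in range(size):
        row = []
        for x in range(size):
            sy = y * src_h // size   # nearest source row
            sx = x * src_w // size   # nearest source column
            row.append(scale_pixel(image[sy][sx]))
        out.append(row)
    return out

tiny = [[0, 255], [255, 0]]          # 2x2 dummy single-channel image
processed = preprocess(tiny)
print(len(processed), len(processed[0]))   # 224 224
print(processed[0][0], processed[0][-1])   # -1.0 1.0
```

Applying the same deterministic resize and rescale at training and inference time is what makes the collected field images compatible with the pretrained MobileNetV2 weights.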
Chapter 2
Literature Survey
2.1 Introduction
Over the years, research has focused on developing techniques to automate plant disease
identification in order to reduce dependency on manual inspection, which not only takes a lot
of time but also leaves room for human error. However, with
the advancement of deep learning and image recognition especially through Convolutional
Neural Networks (CNNs), there has been a major move toward automating the process of
detecting plant diseases.
U. Abhishek (2025) [5] conducted a study using various public datasets, including RGB,
multispectral, and hyperspectral images. The research involved advanced deep learning
techniques like Convolutional Neural Networks (CNN), Vision Transformers, and Generative
Adversarial Networks (GANs). The models performed impressively, with the CNN achieving
an accuracy of 99.35%, Inception-v3 scoring 92.60%, and Mask R-CNN getting 78.8%. The
future goal of this work is to develop real-time, AI-driven plant disease detection systems that
can be used effectively in agricultural environments. B. Ananthakrishnan (2025) [6] utilized a
dataset of 5,867 tea leaf images collected from Kaggle and GitHub. The study employed CNN
along with traditional machine learning techniques like OpenCV, Multi-Layer Perceptron
(MLP), Support Vector Machine (SVM), and Decision Trees. The combination of methods
yielded a strong accuracy of 95.06%. The future direction suggested is integrating these models
into mobile applications to help farmers detect diseases in the field more easily. D. Aria (2024)
[7] worked with various public plant disease datasets available on Kaggle. The methodology
involved a blend of CNN, Recurrent Neural Networks (RNN), GANs, and Transformers. The
Deep Convolutional Neural Network (DCNN) achieved 99.35% accuracy, while a simpler
smartphone-optimized model scored 65%. This work aims to improve AI models to support
real-time crop monitoring by integrating contextual data for better field applications. K.
Priyanka (2024) [8] focused specifically on rice leaf diseases using datasets sourced from
Kaggle. A CNN model was employed to identify the diseases, which yielded an accuracy of
95%. The researcher proposes future improvements in the model’s efficiency, particularly to
make it more applicable for real-world agricultural scenarios.
Bahaa S (2023) [9] used a large dataset consisting of 38 classes of both infected and healthy
crop leaves sourced from GitHub. The model used was EfficientNetV2S, a powerful neural
network known for balancing accuracy and speed. This model achieved an accuracy of 95.01%.
Future work involves extending the trained model to handle a wider variety of crops and
environmental conditions. J. S. Diana (2023) [10] developed separate datasets for rice, wheat,
and maize diseases using images from Google. The researcher used a combination of Xception,
MobileNet, and InceptionV3 models. The models performed well with accuracies of 95.80%
for maize, 97.28% for rice, and 96.32% for wheat. Future work involves applying these deep
learning models to other crops for broader agricultural application. M. Emmanuel (2023) [11]
utilized a robust dataset comprising 5,170 real-world field images and 8,629 annotated leaves
across 27 disease classes, sourced from Kaggle. Models like MobileNet, VGG16, InceptionV3,
YOLOv3, and others were implemented using transfer learning. The MobileNet model
achieved the highest accuracy of 99.69%, while EfficientNetV and DenseNet121 also
performed excellently. The study recommends expanding the dataset and developing UAV-
based and mobile systems for real-time disease detection.
K. Faiza (2023) [12] collected maize leaf images from the University Research Farm Koont
(PMAS-AAUR), focusing on three maize diseases. A series of deep learning models, including
YOLOv3-tiny to YOLOv8n, were used. The final models performed with a high accuracy of
99.04%. The researcher aims to integrate the detection model into a mobile application for real-
time monitoring and tracking in the field. K. Jameer (2023) [13] relied on multiple datasets
including the popular PlantVillage dataset, all sourced from GitHub. The study experimented
with various models such as SVM, Bayes Discriminant Function, and a custom MDC
algorithm. The Bayes function achieved 98.32% accuracy, and the MDC model achieved
95.71%. The goal is to implement these models into real-world agricultural systems and
potentially develop smartphone-based diagnostic tools.
C. Md. Jalal Uddin (2023) [14] compiled a large dataset of 17,430 images from Kaggle,
covering bell pepper, tomato, and potato leaves with 14 distinct disease classes. Multiple
machine learning models including CNN, KNN, Logistic Regression, SVM, and Decision Tree
were tested. CNN led with 99% accuracy, while others like SVM and Logistic Regression
lagged. The study emphasizes future expansion to include multi-seasonal, real-world image
variations and AI-powered treatment suggestions. M. Shoaib (2023) [15] used the extensive
PlantVillage dataset, which includes around 38,000 images across 14 different crops. This
work employed a variety of deep learning models including AlexNet, VGG16, VGG19,
ResNet, InceptionV3, DenseNet, YOLOv3, SSD, and advanced object detection frameworks
like Faster R-CNN and Mask R-CNN. The results were impressive, with a top accuracy of
99.97%. The study envisions applying the models in real-world agricultural systems and
creating smartphone-based diagnostic tools for farmers. S. B. Tej (2023) [16] used datasets
collected from UAV-based (drone) remote sensing of various crops. The models tested
included Faster R-CNN, YOLOv5, ResNet50 combined with SVM, and U-Net for
segmentation. These models achieved high accuracy, such as 97.86% with ResNet50 + SVM,
and 96% with YOLOv5. The study aims to develop automated agricultural monitoring systems,
especially using drones for early disease detection.
C. Jackulin (2022) [17] worked with images collected from actual agricultural fields to reflect
real-world conditions. The methodology included CNN along with traditional ML models like
SVM, Random Forest, Decision Tree, KNN, and Naive Bayes. The CNN-based models
achieved high accuracy, with the top result being 98.56%. The research points toward
expanding datasets with real-world images and AI-based treatment recommendations. J. Arun
Pandian (2022) [18] used a massive dataset of 147,500 images covering 58 classes of plant leaf
diseases. The models applied were 14-layered DCNNs and other CNN-based architectures,
achieving 99.43% accuracy. The future scope of the study is focused on model automation and
using AI for early-stage disease detection. P. Jagamohan (2022) [19] focused on rice leaf
diseases, using a dataset of 5,932 images. The study used CNN-based methods and obtained
an accuracy of 97.20%. The main goal is to deploy the model on edge devices and integrate
real-time use in agricultural fields. M. Rabbia (2022) [20] targeted potato leaf diseases using a
dataset of 2,152 images sourced from Kaggle and Google. The study applied basic CNN
models, achieving accuracies such as 99.6% with CNN, 97% with K-NN, and 88% with SVM.
The future scope involves adapting the model to human activity recognition and expanding it
to other domains.
H. Sunil S. (2022) [21] used tomato leaf images from Kaggle showing various disorders. The
model architecture used was CNN, and the best-performing model reached 94% accuracy. The
study aims to extend the disease detection system to other plant species and deploy a mobile-
based detection system. A. A. Ahmed (2021) [22] compiled a very large dataset of 96,206
images from Kaggle, PlantVillage, and Google. Models like VGG16, VGG19, InceptionV3,
and VGG19 with logistic regression were applied. The VGG19 + Logistic Regression approach
achieved the highest accuracy of 97.8%. Future plans include UAV-based deployment for
aerial crop monitoring. Kowshik B (2021) [23] collected plant leaf images manually using a
digital camera. The study used traditional machine learning models like Naive Bayes, Decision
Tree, KNN, SVM, and Random Forest. The best model, Random Forest, achieved 79.23%
accuracy. The focus is now on developing mobile apps and improving models using transfer
learning and other advanced techniques. L. Lili (2021) [24] used a combination of public
datasets such as PlantVillage and AI Challenger 2018. Deep learning models like ResNet-50,
DenseNet-121, GoogLeNet, and others were tested. The DenseNet-121 model yielded the
highest accuracy of 99.75%. The study plans to build open-access plant disease image
repositories and optimize training time while maintaining high performance.
S. Rahul (2021) [25] worked with two separate datasets: 5,932 images for rice leaves and 1,500
for potato leaves. The proposed CNN model achieved high accuracy 99.58% for rice and
97.66% for potato. The study emphasizes the model’s effectiveness and recommends
expanding its usage to other crop types and deploying it in mobile applications for real-time
use by farmers. P. Shanta (2021) [26] used multiple plant leaf disease datasets from Kaggle
and implemented deep learning models like AlexNet, SqueezeNet, and a custom CNN model.
The custom CNN reached 98% accuracy. The study also explored hybrid approaches such as
combining image segmentation with SVM or backpropagation neural networks, showing high
potential. The goal is to enhance classification and build real-time apps for farmers. T.
Divyansh (2020) [27] used 2,152 potato leaf images from the PlantVillage dataset. The models
evaluated include SVM, Naive Bayes, KNN, Decision Tree, and Random Forest. The best
result came from Random Forest with 79.23% accuracy. The study suggests improving dataset
diversity and building real-time farmer tools as future work. P. P. Kshyanaprava (2020) [28]
focused on maize leaf images for classification purposes. While the dataset was relatively
simple, it still contributed to understanding model effectiveness. A range of models were
applied including CNN, ResNet34, and transfer learning. The accuracy peaked at 96.21%. The
future aim is to deploy real-time monitoring using UAVs and expand functionality.
M. Omkar (2020) [29] used the comprehensive PlantVillage dataset with 54,444 images.
Models like YOLOv3 and ResNet18 were applied, achieving 96% accuracy with ResNet18.
The future plan includes extending the model to provide additional services like nearby market
listings for farmers.V. Aravindhan (2019) [30] used PlantVillage dataset from GitHub and
integrated it with real-world applications using deep learning models such as VGG, ResNet,
and SqueezeNet. High accuracy (up to 97.86% with ResNet50) was achieved. The researcher
proposes integrating transfer learning and hyperspectral imaging to boost early detection
capabilities. A. Kawcher (2019) [31] collected rice leaf images from agricultural research
centers. The models implemented included AlexNet, VGG16, ResNet, etc., and they achieved
strong results with accuracies reaching up to 99.19% with ResNet50. The study suggests using
nature-inspired algorithms to further enhance performance. T. Muammer (2019) [32] gathered
real-world plant images from agricultural fields in various regions of Turkey. The work
employed multiple CNN models and aimed at integrating AI for early disease detection. The
models achieved up to 98.59% accuracy, and the goal is to develop mobile-based detection
systems with high efficiency.
S. H. Muhammad (2019) [33] used both public and custom agricultural datasets. Techniques
like CNN, VGG, Inception, and ResNet were implemented. Results ranged from 76% to
99.84%, depending on the model. The future plan includes reducing complexity for IoT-based
smart farming systems. M. Sammy V. (2019) [34] created a large dataset of 35,000 images
covering various crops like apple, corn, and tomato. Using models such as Faster R-CNN, SSD,
and R-FCN, the study reached accuracies between 90% and 97%. The aim is to improve model
generalization and collect more diverse, real-world images. N. Sapna (2019) [35] worked with
PlantVillage and custom real-world datasets. Using CNN-based deep learning models such as
LeafNet and PlantDiseaseNet, the best performance was from DenseNet-121 with 99.75%
accuracy. Future work involves expanding datasets and field-based deployments.
This chapter explores various research studies and existing systems that aim to tackle plant
diseases, discusses the methodologies employed, and identifies the gaps that this project aims
to address. Based on the research done the survey is given below. Table 2.1 shows the research
work.
Table 2.1 - Literature Survey

S.No | Author (Year) [Ref] | Dataset | Models | Accuracy | Future Scope
8 | K. Faiza (2023) [12] | Collected from the University Research Farm Koont, PMAS-AAUR; includes three maize diseases | YOLOv3-tiny, YOLOv4, YOLOv5s, YOLOv7s, and YOLOv8n | 99.04% | Integration of the model into a mobile application for real-time disease detection and tracking
14 | J. Arun Pandian (2022) [18] | 147,500 images covering 58 different plant leaf classes | 14-DCNN | 14-DCNN: 99.97% | Applying the model to other plant species and optimizing it for real-world agricultural applications
21 | S. Rahul (2021) [25] | Rice Leaf Dataset: 5,932 images; Potato Leaf Dataset: 1,500 images | CNN | CNN: 98.65% | Using nature-inspired algorithms to improve CNN model performance
22 | P. Shanta (2021) [26] | Multiple plant leaf disease datasets from Kaggle | Convolutional Neural Network (CNN), AlexNet and SqueezeNet for feature extraction | CNN: 98%; AlexNet: 95.65%; SqueezeNet: 94.30% | Integrating AI-based early disease prediction models
23 | T. Divyansh (2020) [27] | PlantVillage Dataset, 2,152 images of potato leaves | VGG16, VGG19, InceptionV3 | VGG16: 92%; VGG19: 97.8%; InceptionV3: 94.30% | Developing a mobile-based disease detection system, improving dataset quality, and optimizing computational efficiency
24 | P. P. Kshyanaprava (2020) [28] | Images of maize leaves used for classification | Naive Bayes, Decision Tree, K-Nearest Neighbor, Support Vector Machine, and Random Forest | Naive Bayes: 77.46%; Decision Tree: 74.35%; KNN: 76.16%; SVM: 77.56%; Random Forest: 79.23% | Enhancing classification accuracy and developing a real-time application for farmers
27 | A. Kawcher (2019) [31] | Rice leaf images collected from agricultural research centers | Decision Tree, KNN, Logistic Regression, Naïve Bayes | Decision Tree: 94.91%; KNN: 98.84%; Logistic Regression: 75.46%; Naïve Bayes: 58.80% | Extending the model to mobile applications for farmers; implementing CNN-based models for automated feature extraction
32 | J. Ashwin (2018) [36] | PlantVillage dataset with four disease categories (Bacterial Spot, Yellow Leaf Curl Virus, Late Blight, and Healthy Leaf) | CNN | CNN: 98.59% | Extending the system to detect more plant diseases and real-time implementation for farmers
2.2 Research Gap

Despite high model accuracies, few models are optimized or deployed in real time on
mobile or edge devices. There is a practical gap in making these systems accessible and
usable for farmers in the field.
2.3 Dataset Collection

The dataset consists of 21 distinct classes encompassing a range of plant diseases and healthy
leaf conditions across three major crops: mango, potato, and eggplant. The images were
captured in natural lighting using a 48-megapixel rear camera on an iPhone, directly from rural
agricultural fields. This real-world data collection ensures that the images reflect authentic
environmental conditions, including variations in lighting, background, and leaf orientation,
making the dataset valuable for training robust and practical plant disease detection models.
Unlike many existing datasets that rely on controlled environments or online sources, this
dataset emphasizes field-level diversity and naturally occurring features, enhancing the
generalizability of the trained model. Each class contains a balanced distribution of healthy
and infected samples, aiding in the model’s ability to learn disease-specific patterns.
Additionally, care was taken to avoid duplicate or overly similar images to maintain dataset
integrity. The dataset underwent manual annotation and verification by domain experts to
ensure labeling accuracy.
According to Table 2.2, the dataset includes pictures of three different plant species: potato,
mango, and eggplant. Images of both healthy and diseased leaves are included in each plant
category to represent a range of common plant health conditions. In total, the dataset contains
7,983 images. The eggplant classes comprise pest, blight, wilt, and healthy conditions. Mango
contains both healthy samples and a wider variety of illnesses, such as bacterial and fungal
infections. With a broad range of diseases caused by bacteria, viruses, fungi, and pests,
potato has the most classes. This varied dataset is a useful resource for developing machine
learning models for the identification and categorisation of plant diseases.
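The 21-class structure described above can be inventoried with a short script, assuming a one-folder-per-class layout on disk (e.g. `data/Mango_Healthy/`, `data/Potato_Virus/`). The folder names and the `class_counts` helper are hypothetical; the report does not specify the actual on-disk layout.

```python
# Sketch of inventorying a one-folder-per-class image dataset.
# Folder layout and names are assumptions, not the project's actual layout.

from pathlib import Path

def class_counts(root: str) -> dict:
    """Count image files in each class folder directly under `root`."""
    counts = {}
    for class_dir in sorted(Path(root).iterdir()):
        if class_dir.is_dir():
            counts[class_dir.name] = sum(
                1 for f in class_dir.iterdir()
                if f.suffix.lower() in {".jpg", ".jpeg", ".png"}
            )
    return counts

# Usage: for the dataset described here, one would expect 21 class folders
# and a grand total of 7,983 images.
# counts = class_counts("data")
# print(len(counts), sum(counts.values()))
```

A quick count like this is also a simple check that the classes remain balanced and that no duplicates slipped in during collection.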
Table 2.2 - Dataset Images Classification
Figure 2.1 shows leaf samples grouped by plant category and how they look under different
conditions. The conditions presented for potato leaves are Potato Pest, Potato Healthy, Potato
Nematode, Potato Virus, Potato Early Blight, Potato Phytophthora, Potato Late Blight, and
Potato Bacteria. Eggplant leaves are also included, with Eggplant Healthy, Eggplant Pest,
Eggplant Leaf Spot Blight, and Eggplant Wilt. For mango leaves, the conditions include
Mango Die Back, Mango Bacterial Canker, Mango Leaf Blight, Mango Scab, Mango Powdery
Mildew, Mango Grey Blight, Mango Anthracnose, and Mango Healthy.
Figure 2.1 - Sample leaves Dataset used
2.3.1 Potato Virus:
Potato Virus infection is a common and serious condition that can be caused by a variety
of viruses, including Potato Virus Y (PVY), Potato Virus X (PVX), and Potato Leafroll
Virus (PLRV). These viruses primarily infect plants through the use of infected seed
tubers and are transmitted by vectors such as aphids, especially Myzus persicae (green
peach aphid). Once inside the plant, the virus replicates in plant cells and spreads
through plasmodesmata to other parts of the plant, disrupting normal physiological
functions.
2.3.18 Mango Powdery Mildew:
Powdery mildew on mango, caused by Oidium mangiferae, is a widespread and
economically significant disease that affects leaves, panicles, and young fruits. It is
characterized by the presence of a white, powdery fungal growth on the surface of
leaves and inflorescences. This growth consists of fungal mycelium and spores which
give the plant parts a dusty appearance. Infected leaves may curl, dry, and fall off. When
panicles are infected, flower drop occurs, leading to poor fruit set and low yields.
Chapter 3
Proposed Methodology
3.1 Introduction
This project focuses on creating a real-time system that uses deep learning to identify plant
diseases from images.
Developed with farmers and agricultural professionals in mind, the system provides quick
yet accurate diagnoses that can easily be obtained through a web browser. This makes it
possible to react quickly to early signs of an outbreak and thus reduce its impact on the
crops being grown. The proposed model is built upon MobileNetV2, a lightweight
architecture that fits well on low-powered devices.
The model has been trained on a set of diseases affecting three major crops: mango,
potato, and eggplant. The method encompasses data and image acquisition and
preprocessing, model training and testing, and the deployment of a Streamlit web
application to host the trained model.
The whole system is designed to be fast, easily scalable, and usable even in areas that have
marginal internet connection. Thus, the presented methodology of leveraging AI to address
real-life agricultural issues can be a strong basis for smart farming and precision agriculture.
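As a minimal sketch of the image-preprocessing step in this pipeline (assuming the standard MobileNetV2 convention of scaling 8-bit pixels to the [-1, 1] range; the function name and dummy image are illustrative):

```python
import numpy as np

def preprocess(image: np.ndarray) -> np.ndarray:
    """Scale raw 8-bit RGB pixels to the [-1, 1] range MobileNetV2 expects."""
    return image.astype(np.float32) / 127.5 - 1.0

# A dummy white 224x224 RGB image stands in for a captured leaf photo.
dummy = np.full((224, 224, 3), 255, dtype=np.uint8)
scaled = preprocess(dummy)
print(scaled.max())  # 1.0
```

In practice the same scaling is available as `tf.keras.applications.mobilenet_v2.preprocess_input`.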
Figure 3.1 - Proposed Work flowchart
Figure 3.2 – MobileNetV2 Architecture
Table 3.1 outlines the architecture of a lightweight convolutional neural network,
resembling MobileNetV2. It is optimized for mobile and embedded vision applications.
The model learns features in a hierarchical manner: from basic edges and textures in early
layers to complex patterns like disease shapes and spot arrangements in later layers. After
passing through the full architecture, the final softmax layer outputs probabilities for each leaf
class (e.g., healthy, bacterial spot, early blight, etc.).
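The mapping from the final layer's raw scores (logits) to class probabilities can be sketched as follows (the 21-way output matches the class count used in this project; the logit values are illustrative):

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """Convert raw scores into probabilities that sum to 1 (numerically stable)."""
    shifted = np.exp(logits - logits.max())
    return shifted / shifted.sum()

logits = np.zeros(21)   # 21 leaf classes, all equally likely before training
probs = softmax(logits)
print(round(float(probs.sum()), 6))  # 1.0
print(round(float(probs[0]), 4))     # 0.0476, i.e. 1/21 per class
```

The predicted class is simply the index of the largest probability (`np.argmax(probs)`).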
3.3.6 Training of Data:
This phase involves feeding the preprocessed real-time images into MobileNetV2. During
training:
• The model adjusted its internal weights to minimize prediction error using
backpropagation.
• Loss functions (like categorical cross-entropy) were used to measure how far predictions
were from actual labels.
• Optimization algorithms (such as Adam) helped fine-tune these weights efficiently.
• Multiple epochs (iterations over the dataset) were conducted to ensure the model
understood deep patterns in the leaf images.
The model essentially learned to distinguish between healthy and diseased leaves, and among
different disease types, based solely on the real-time images collected in the field, making it
highly realistic and practical for real-world use.
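The categorical cross-entropy loss mentioned above can be sketched in a few lines (numpy is used here for illustration; during actual training this would be Keras's built-in loss):

```python
import numpy as np

def categorical_cross_entropy(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean cross-entropy between one-hot labels and predicted probabilities."""
    y_pred = np.clip(y_pred, 1e-12, 1.0)   # avoid log(0)
    return float(-(y_true * np.log(y_pred)).sum(axis=1).mean())

# One sample, three classes; the true class is index 0.
y_true = np.array([[1.0, 0.0, 0.0]])
confident = np.array([[0.90, 0.05, 0.05]])
uncertain = np.array([[0.40, 0.30, 0.30]])
print(round(categorical_cross_entropy(y_true, confident), 3))  # 0.105
print(round(categorical_cross_entropy(y_true, uncertain), 3))  # 0.916
```

The loss is lower when the model assigns high probability to the correct class, which is exactly the signal backpropagation uses to adjust the weights.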
3.3.7 Optimization:
After initial training, further optimization steps were carried out to prepare the model for real
deployment. These included:
• Quantization: Converting model weights to 8-bit integers to reduce size.
• Pruning: Removing redundant parameters to improve speed.
• Tuning hyperparameters such as learning rate and dropout to boost performance.
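As an illustration of what 8-bit quantization does (a symmetric per-tensor scheme sketched in numpy; in practice this would be performed by TensorFlow Lite's converter rather than by hand):

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 plus a scale factor for dequantization."""
    scale = float(np.abs(weights).max()) / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

w = np.random.default_rng(0).normal(size=1000).astype(np.float32)
q, scale = quantize_int8(w)
restored = q.astype(np.float32) * scale

print(w.nbytes // q.nbytes)                         # 4: int8 storage is 4x smaller
print(float(np.abs(restored - w).max()) <= scale)   # True: rounding error stays small
```

The 4x storage reduction is why quantization is the main lever for shrinking the deployed model.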
3.3.8 Deployment:
With our trained and optimized MobileNetV2 model, we’re now able to deploy it as a real-
world solution. Because our model is based on data captured by phone, it works well in the
same conditions it was trained for, giving it a strong edge over generic models trained on
curated datasets.
3.3.9 Feedback:
Finally, once the model is deployed, a feedback mechanism can be added. If the web app
predicts incorrectly, users can report or correct the classification. This feedback is
valuable because:
• It helps collect new disease types or unseen conditions.
• It ensures the model remains accurate and relevant over time.
Chapter 4
Implementation
4.1 Introduction
This chapter describes how the data was collected, preprocessed, and used to train a deep learning
model, followed by deployment on a user-accessible platform. The core of the implementation
lies in leveraging the MobileNetV2 architecture for its efficiency and accuracy, combined with
the simplicity and accessibility of Streamlit for real-time web deployment.
The complete workflow from gathering field images to delivering real-time predictions has
been designed to ensure ease of use, speed, and reliability, especially catering to the needs of
farmers and agricultural experts.
Figure 4.1 – Implementation Flowchart
4.2.4 Model Selection and Training:
Use MobileNetV2 as the base model, leveraging its pre-trained ImageNet weights for
transfer learning. Customize the classification head by adding:
• GlobalAveragePooling2D
• Fully connected Dense layer with ReLU activation
• Dropout for regularization
• Output Dense layer with Softmax activation for multi-class classification (21
classes)
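A sketch of the forward pass through this classification head (numpy stands in for the Keras layers; the 7x7, 1280-channel feature map matches MobileNetV2's final output for 224x224 input, and the weights are random placeholders):

```python
import numpy as np

rng = np.random.default_rng(42)

# Dummy MobileNetV2 output for one image: 7x7 spatial grid, 1280 channels.
features = rng.normal(size=(7, 7, 1280)).astype(np.float32)

# GlobalAveragePooling2D: average over the two spatial dimensions.
pooled = features.mean(axis=(0, 1))               # shape (1280,)

# Dense layer with ReLU activation (placeholder weights).
w1 = rng.normal(scale=0.05, size=(1280, 128))
hidden = np.maximum(pooled @ w1, 0.0)

# Dropout is active only during training, so it is skipped at inference.

# Output Dense layer with softmax over the 21 classes.
w2 = rng.normal(scale=0.05, size=(128, 21))
logits = hidden @ w2
probs = np.exp(logits - logits.max())
probs /= probs.sum()

print(probs.shape)  # (21,)
```

Only this head is trained from scratch; the MobileNetV2 base keeps its ImageNet weights under transfer learning.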
● GPU (optional) – NVIDIA GPU with CUDA support
● Mobile Camera – Any good quality mobile camera.
Chapter 5
Result & Discussion
5.2.2 Confusion Matrix Analysis
Performance on individual classes was evaluated using a multiclass confusion matrix.
This showed that while classes such as Potato Healthy and Mango Powdery Mildew were
classified with high accuracy, classes like Eggplant Wilt and Mango Anthracnose had
comparatively higher misclassification rates. Some of these confusions were aggravated
by the fact that symptoms can be very similar; several diseases manifest as leaf spots or
wilting.
Figure 5.2 presents the confusion matrix, which highlights how well the plant disease
classification model performed, achieving an overall accuracy of 74%. The matrix
represents four categories: true positives (bottom-right), true negatives (top-left), false
positives (bottom-left), and false negatives (top-right). In this case, the model correctly
classified 370 diseased leaf images as diseased and 370 healthy leaf images as healthy.
However, it also misclassified 130 healthy leaves as diseased and 130 diseased leaves as
healthy.
This performance suggests that the model can generally tell the difference between
healthy and diseased samples, but there's still some confusion in a few cases, possibly
due to visual similarities between classes or limited image variety. Nevertheless, the
balanced confusion matrix suggests that the model is not biased toward any one class,
and further improvements can be achieved through dataset expansion or fine-tuning the
model architecture.
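The figures quoted above can be verified directly from the four cells of the matrix:

```python
# Counts read from the confusion matrix in Figure 5.2.
tp, tn, fp, fn = 370, 370, 130, 130   # true/false positives and negatives

accuracy = (tp + tn) / (tp + tn + fp + fn)
print(accuracy)  # 0.74, matching the reported overall accuracy
```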
5.2.3 Precision, Recall, and F1-Score
The following average metrics were recorded across all classes:
• Precision: 0.75
• Recall: 0.73
• F1-Score: 0.74
These values indicate a fairly balanced model, with no significant overfitting or
underfitting on any particular class.
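These averages are internally consistent; the F1-score is the harmonic mean of precision and recall:

```python
precision, recall = 0.75, 0.73   # class-averaged values reported above

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # 0.74, matching the reported F1-score
```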
5.3 Visual Results and Web Application Output
5.3.1 Interface Overview
• Home Page: Describes the app’s purpose, usage instructions, and supported crops.
5.4 Discussion
The development and testing of the proposed plant disease identification system have yielded
insightful outcomes that reflect both the promise and the limitations of machine learning (ML)
in real-world agricultural applications. With a final model accuracy of 74.39%, achieved using
MobileNetV2 on a custom, real-time dataset, this project presents a significant stride in
bridging the gap between theoretical deep learning research and practical deployment for
farmers. This section delves deeply into the interpretation of results, real-world relevance,
challenges, and future opportunities based on experimental findings.
Our dataset, gathered through smartphones in actual field conditions, contains images
with varied lighting, backgrounds, and angles, simulating the way farmers would capture
images. This realism is reflected in the resulting accuracy of 74.39%.
5.4.2 Environmental Noise
The model’s performance was impacted by natural noise in field photography. This
includes:
• Inconsistent lighting (e.g., shadowed vs. sunlit leaves)
• Cluttered backgrounds (soil, weeds, debris)
• Partial leaves or occlusion (only half the leaf visible)
• Blurred images due to motion or camera shake
These artifacts affected feature extraction, as MobileNetV2 depends heavily on edge
patterns and texture granularity to classify. Yet, this reinforces the project's core
principle: building for reality, not ideal lab conditions.
5.4.4 Deployment
One of the project’s biggest wins is its deployability:
• The trained model size was reduced to under 14MB, making it viable for mobile
devices.
• Inference time was under 2 seconds per image on mid-tier smartphones.
• The model is integrated into a Streamlit web app with a simple interface,
multilingual support, and actionable output.
Compared to complex models like ResNet50 or InceptionV3, which require GPU
resources and extensive RAM, this system can be used in rural, low-bandwidth, and
offline settings, fulfilling its intended use case.
Chapter 6
Conclusion & Future Scope
6.1 Conclusion
This project showcases a practical approach to detecting plant diseases in real time using
images and the MobileNetV2 deep learning model. It centers on three important crops, mango,
potato, and eggplant, and is trained to identify 21 different categories of healthy and
diseased leaves. What makes this work stand out is the use of a custom dataset, carefully
collected with a smartphone directly from actual farms, making the system more applicable
and reliable in real-world agricultural settings.
The MobileNetV2 model, with a light model size of 3.5 MB, achieved an accuracy of
74.39%, which is good performance for a multi-class classification problem with moderate
computational burden. The model was implemented as a Streamlit web application where
users can upload pictures and receive a report of the diseases detected, along with voice
and text recommendations for treating them. This not only makes the system technically
sound and realistic, but also practical for farmers with limited resources in rural areas.
Overall, the project bridges the gap between machine learning technology and the agricultural
sector, offering a scalable, cost-effective solution for early disease detection and better crop
management.
6.2 Future Scope
While the current system lays a strong foundation, there are several opportunities to extend and
improve its functionality:
6.2.1 Expansion to More Crops and Diseases:
The model can be extended to cover additional crops and a wider variety of diseases,
increasing its relevance and utility across different regions.
6.2.2 Integration with Mobile Applications:
Creating a dedicated Android/iOS app would allow offline usage, enabling farmers in
remote areas with limited internet connectivity to benefit from the system.
6.2.3 Real-Time Camera Capture and Diagnosis:
Incorporating live image capture using a mobile device camera for direct, on-spot
diagnosis would enhance usability and convenience.
6.2.4 Multi-Language and Regional Voice Support:
Expanding voice output to include more regional languages and dialects can further
improve accessibility for diverse user groups.
6.2.5 Feedback Loop and Retraining:
Implementing a system where users can give feedback on predictions will allow the
model to improve over time through retraining on new data.
6.2.6 Cloud-Based Monitoring Dashboard:
A centralized dashboard for agricultural officers to monitor disease spread trends
across regions can be developed to aid in larger-scale agricultural planning and
disease control.
References
1. Food and Agriculture Organization of the United Nations (FAO). (2021). The State of Food
and Agriculture.
2. Strange, R. N., & Scott, P. R. (2005). Plant disease: a threat to global food security.
Philosophical Transactions of the Royal Society B: Biological Sciences, 360(1464), 1251–
1262.
3. Sandler, M., Howard, A., Zhu, M., Zhmoginov, A., & Chen, L. C. (2018). MobileNetV2:
Inverted residuals and linear bottlenecks. In Proceedings of the IEEE Conference on
Computer Vision and Pattern Recognition (CVPR), 4510–4520.
4. Too, E. C., Yujian, L., Njuki, S., & Yingchun, L. (2019). A comparative study of fine-tuning
deep learning models for plant disease identification. Computers and Electronics in
Agriculture, 161, 272–279.
5. Upadhyay, A., Chandel, N. S., Singh, K. P., Chakraborty, S. K., Nandede, B. M., Kumar, M.,
& Elbeltagi, A. (2025). Deep learning and computer vision in plant disease detection: a
comprehensive review of techniques, models, and trends in precision agriculture. Artificial
Intelligence Review, 58(3), 1-64.
6. Balasundaram, A., Sundaresan, P., Bhavsar, A., Mattu, M., Kavitha, M. S., & Shaik, A.
(2025). Tea leaf disease detection using segment anything model and deep convolutional
neural networks. Results in Engineering, 25, 103784.
7. Dolatabadian, A., Neik, T. X., Danilevicz, M. F., Upadhyaya, S. R., Batley, J., & Edwards,
D. (2025). Image‐based crop disease detection using machine learning. Plant Pathology,
74(1), 18-38.
8. Kulkarni, P., & Shastri, S. (2024). Rice leaf diseases detection using machine learning.
Journal of Scientific Research and Technology, 17-22.
9. Hamed, B. S., Hussein, M. M., & Mousa, A. M. (2023). Plant Disease Detection Using Deep
Learning. Int. J. Intell. Syst. Appl, 15, 38-50.
10. Joseph, D. S., Pawar, P. M., & Chakradeo, K. (2024). Real-time plant disease dataset
development and detection of plant disease using deep learning. IEEE Access, 12, 16310-
16333.
11. Moupojou, E., Tagne, A., Retraint, F., Tadonkemwa, A., Wilfried, D., Tapamo, H., &
Nkenlifack, M. (2023). FieldPlant: A dataset of field plant images for plant disease detection
and classification with deep learning. IEEE Access, 11, 35398-35410.
12. Khan, F., Zafar, N., Tahir, M. N., Aqib, M., Waheed, H., & Haroon, Z. (2023). A mobile-
based system for maize plant leaf disease detection and classification using deep learning.
Frontiers in Plant Science, 14, 1079366.
13. Kotwal, J., Kashyap, R., & Pathan, S. (2023). Agricultural plant diseases identification: From
traditional approach to deep learning. Materials Today: Proceedings, 80, 344-356.
14. C. Md. Jalaluddin & Al-Tuwaijari, J. M. (2023, April). Plant leaf diseases detection and
classification using image processing and deep learning techniques. In 2020 International
Conference on Computer Science and Software Engineering (CSASE) (pp. 259-265). IEEE.
15. Shoaib, M., Shah, B., Ei-Sappagh, S., Ali, A., Ullah, A., Alenezi, F., ... & Ali, F. (2023). An
advanced deep learning models-based plant disease detection: A review of recent research.
Frontiers in Plant Science, 14, 1158933.
16. Shahi, T. B., Xu, C. Y., Neupane, A., & Guo, W. (2023). Recent advances in crop disease
detection using UAV and deep learning techniques. Remote Sensing, 15(9), 2450.
17. Jackulin, C., & Murugavalli, S. J. M. S. (2022). A comprehensive review on detection of
plant disease using machine learning and deep learning approaches. Measurement: Sensors,
24, 100441.
18. Pandian, J. A., Kumar, V. D., Geman, O., Hnatiuc, M., Arif, M., & Kanchanadevi, K. (2022).
Plant disease detection using deep convolutional neural network. Applied Sciences, 12(14),
6982.
19. Padhi, J., Mishra, K., Ratha, A. K., Behera, S. K., Sethy, P. K., & Nanthaamornphong, A.
(2025). Enhancing Paddy Leaf Disease Diagnosis-a Hybrid CNN Model using Simulated
Thermal Imaging. Smart Agricultural Technology, 100814.
20. Mahum, R., Munir, H., Mughal, Z. U. N., Awais, M., Sher Khan, F., Saqlain, M., ... & Tlili,
I. (2023). A novel framework for potato leaf disease detection using an efficient deep learning
model. Human and Ecological Risk Assessment: An International Journal, 29(2), 303-326.
21. Harakannanavar, S. S., Rudagi, J. M., Puranikmath, V. I., Siddiqua, A., & Pramodhini, R.
(2022). Plant leaf disease detection using computer vision and machine learning algorithms.
Global Transitions Proceedings, 3(1), 305-310.
22. Ahmed, A. A., & Reddy, G. H. (2021). A mobile-based system for detecting plant leaf
diseases using deep learning. AgriEngineering, 3(3), 478-493.
23. Kowshik, B., Savitha, V., Karpagam, G., & Sangeetha, K. (2021). Plant disease detection
using deep learning. International Research Journal on Advanced Science Hub, 3(3S), 30-33.
24. Li, L., Zhang, S., & Wang, B. (2021). Plant disease detection and classification by deep
learning a review. IEEE Access, 9, 56683-56698.
25. Sharma, R., Singh, A., Jhanjhi, N. Z., Masud, M., Jaha, E. S., & Verma, S. (2022). Plant
Disease Diagnosis and Image Classification Using Deep Learning. Computers, Materials &
Continua, 71(2).
26. S. Prabhakar (2021). Plant disease identification using Deep Learning: A review. The Indian
Journal of Agricultural Sciences, 90(2), 249-257.
27. Tiwari, D., Ashish, M., Gangwar, N., Sharma, A., Patel, S., & Bhardwaj, S. (2020, May).
Potato leaf diseases detection using deep learning. In 2020 4th international conference on
intelligent computing and control systems (ICICCS) (pp. 461-466). IEEE.
28. Panigrahi, K. P., Das, H., Sahoo, A. K., & Moharana, S. C. (2020). Maize leaf disease
detection and classification using machine learning algorithms. In Progress in Computing,
Analytics and Networking: Proceedings of ICCAN 2019 (pp. 659-669). Springer Singapore.
29. Mindhe, O., Kurkute, O., Naxikar, S., & Raje, N. (2020). Plant disease detection using deep
learning. International Research Journal of Engineering and Technology, 2497-2503.
30. Venkataramanan, A., Honakeri, D. K. P., & Agarwal, P. (2019). Plant disease detection and
classification using deep neural networks. Int. J. Comput. Sci. Eng, 11(9), 40-46.
31. Ahmed, K., Shahidi, T. R., Alam, S. M. I., & Momen, S. (2019, December). Rice leaf disease
detection using machine learning techniques. In 2019 International Conference on
Sustainable Technologies for Industry 4.0 (STI) (pp. 1-5). IEEE.
32. Türkoğlu, M., & Hanbay, D. (2019). Plant disease and pest detection using deep learning-
based features. Turkish Journal of Electrical Engineering and Computer Sciences, 27(3),
1636-1651.
33. Saleem, M. H., Potgieter, J., & Arif, K. M. (2019). Plant disease detection and classification
by deep learning. Plants, 8(11), 468.
34. Militante, S. V., Gerardo, B. D., & Dionisio, N. V. (2019, October). Plant leaf detection and
disease recognition using deep learning. In 2019 IEEE Eurasia conference on IOT,
communication and engineering (ECICE) (pp. 579-582). IEEE.
35. Nigam, S., & Jain, R. (2020). Plant disease identification using Deep Learning: A review.
The Indian Journal of Agricultural Sciences, 90(2), 249-257.
36. Dhakal, A., & Shakya, S. (2018). Image-based plant disease detection with deep learning.
International Journal of Computer Trends and Technology, 61(1), 26-29.
37. Barbedo, J. G. (2018). Factors influencing the use of deep learning for plant disease
recognition. Biosystems engineering, 172, 84-91.
38. Akila, M., & Deepan, P. (2018). Detection and classification of plant leaf diseases by using
deep learning algorithm. International Journal of Engineering Research & Technology
(IJERT), 6(7), 1-5.