
Real Time Disease Identification in

Plants using Machine Learning

A Project Report
Submitted
In Partial Fulfillment of the Requirements
For the Degree of

Bachelor of Technology (B. Tech)


in
Computer Science & Engineering
by

Aditi Dutt (2101920100017)        Aashi Chaudhary (2101920100004)
Anshika Goyal (2101920100058)     Lakshya Bhati (2101920100162)

Under the Supervision of


Ms. Abha Kaushik
Assistant Professor

G.L. BAJAJ INSTITUTE OF TECHNOLOGY & MANAGEMENT


GREATER NOIDA

DR. A.P.J. ABDUL KALAM TECHNICAL UNIVERSITY,


UTTAR PRADESH, LUCKNOW
2024-25
Declaration

We hereby declare that the project work presented in this report, entitled “Real Time Disease Identification in Plants Using Machine Learning”, in partial fulfillment of the requirements for the award of the degree of Bachelor of Technology in Computer Science & Engineering, submitted to Dr. A.P.J. Abdul Kalam Technical University, Uttar Pradesh, Lucknow, is based on our own work carried out at the Department of Computer Science & Engineering, G.L. Bajaj Institute of Technology & Management, Greater Noida. The work contained in this report is true and original to the best of our knowledge, and the project work reported herein has not been submitted by us for the award of any other degree or diploma.

Signature:
Name: Aditi Dutt

Roll No: 2101920100017

Signature:
Name: Anshika Goyal

Roll No: 2101920100058

Signature:
Name: Aashi Chaudhary

Roll No: 2101920100004

Signature:
Name: Lakshya Bhati

Roll No: 2101920100162

Date:
Place: Greater Noida

Certificate

This is to certify that the project report entitled “Real Time Disease Identification in Plants Using Machine Learning”, done by Aditi Dutt (2101920100017), Anshika Goyal (2101920100058), Aashi Chaudhary (2101920100004), and Lakshya Bhati (2101920100162), is an original work carried out by them in the Department of Computer Science & Engineering, G.L. Bajaj Institute of Technology & Management, Greater Noida, under my guidance. The matter embodied in this project work has not been submitted earlier for the award of any degree or diploma, to the best of my knowledge and belief.

Date:

Ms. Abha Kaushik                          Dr. Sansar Singh Chauhan
Signature of the Supervisor               Head of the Department

Acknowledgement

The merciful guidance bestowed upon us by the Almighty enabled us to carry this project through to a successful end. We humbly pray with sincere hearts for His guidance to continue forever.

We thank our project guide, Ms. Abha Kaushik, who has given us guidance and direction during this project. Her versatile knowledge has helped us at critical times over the span of this project.

We pay special thanks to our Head of Department, Dr. Sansar Singh Chauhan, who has always been present as a support and has helped us in every possible way during this project.

We also take this opportunity to express our gratitude to all those people who have been directly or indirectly involved with us during the completion of the project.

We want to thank our friends, who have always encouraged us during this project.

Last but not least, thanks to all the faculty of the CSE department who provided valuable suggestions during the period of the project.

Abstract

Plant diseases are a major threat to global agriculture, leading to reduced crop yields and economic instability, especially in regions where farming is the primary livelihood. Timely and accurate disease detection is essential for minimizing these losses. This project proposes a real-time plant disease identification system using MobileNetV2, a lightweight convolutional neural network designed for efficient deployment on web and mobile platforms. A total of 21 classes were created from images of mango, potato, and eggplant leaves, representing both healthy and diseased conditions, including biotic (caused by pathogens) and abiotic (caused by environmental stress) disorders. The dataset was preprocessed with image resizing, normalization, and augmentation to enhance variability and model robustness. Using MobileNetV2 with transfer learning, the model was trained to classify these conditions and achieved an accuracy of 74.39%. While not state-of-the-art, this performance demonstrates a meaningful step toward real-time field deployment on low-resource devices. The system is deployed as an interactive Streamlit web application that allows users to upload images of plant leaves, receive instant disease predictions, and access relevant treatment suggestions. By integrating deep learning with an accessible web interface, this project offers a scalable solution to assist farmers in the early detection of plant diseases, enabling more efficient crop management and contributing to the advancement of smart agriculture.

TABLE OF CONTENT

Declaration …………………………………………………………………………… (ii)


Certificate …………………………………………………………………………… (iii)
Acknowledgement………………………………………………………………………. (iv)
Abstract ………………………………………………………………………….. (v)
Table of Content ……………………………………………………………………….. (vi)
List of Figures ………………………………………………………………………….. (vii)
List of Tables ……………………………………………………………………………. (viii)

Chapter 1. Introduction…………………………………………………………… 9 - 15
1.1 Preliminaries.......................................................................................... 9
1.2 Problem Statement ……………………………………………………. 13
1.3 Motivation …………………………………......................................... 13
1.4 Objectives …………………………………………………………….. 14
Chapter 2 Literature Survey................................................................................ 16 – 34
2.1 Introduction ………………………………………………………….. 16
2.2 Research Gap ......................................................................................... 26
2.3 Dataset Collection ………….................................................................. 27
Chapter 3. Proposed Methodology………………………………………………. 35 – 40
3.1 Introduction …………………………………………………………… 35
3.2 Problem Formulation …………………………………………………. 35
3.3 Proposed Work ……………………………………………………… 36
Chapter 4. Implementation…................................................................................. 41 – 44
4.1 Introduction ……………………………………………………………. 41
4.2 Implementation Strategy (Flowchart, Algorithm etc.) …………………. 41
4.3 Tools/Hardware/Software Requirements..……………………………… 43
4.4 Expected Outcome …………………………………….………………. 44
Chapter 5. Result & Discussion ……………………….......................................... 45 – 49
5.1 Result Overview ………………………………………………………… 45
5.2 Model Performance Evaluation ……………….……………………… 45
5.3 Visual Results and Web Application output …………………………… 47
5.4 Discussion ………………………………………………………… 48
Chapter 6. Conclusion & Future Scope.………………………............................ 50 - 51
6.1 Conclusion ……………………………………………………………. 50
6.2 Future Scope ……………………………………………………………. 50
References …………………………………………………………………………… 52 – 55

LIST OF FIGURES

Figure No. Description Page No.


Figure 1.1 Categories Of Plant Disease 10
Figure 2.1 Sample leaves Dataset used 29
Figure 3.1 Proposed Work Flowchart 37
Figure 3.2 MobileNet V2 Architecture 39
Figure 4.1 Implementation Flowchart 42
Figure 5.1 Accuracy Graph 45
Figure 5.2 Confusion Matrix 46
Figure 5.3 Home Page Output 47
Figure 5.4 About Page Output 47
Figure 5.5 Detect Page Output 48

LIST OF TABLES

Table No. Description Page No.


Table 1.1 Biotic Plant Disease 11
Table 1.2 Abiotic Plant Disease 12
Table 2.1 Literature Survey 21
Table 2.2 Dataset Images Classification 28
Table 3.1 Each Layer of MobileNet V2 39

Chapter 1
Introduction

1.1 Preliminaries

Agriculture plays a vital role in supporting economies around the world; it is essential not only for ensuring food security but also for supplying raw materials to various industries [1]. It supports the livelihoods of billions and provides critical resources such as food crops, fibers, and other plant-based products. However, a persistent challenge faced by the agricultural sector is the prevalence of plant diseases. Plant diseases can significantly impact crop production, leading to reduced food supplies and economic hardship, especially in areas where farming is the main source of livelihood [2].

Traditionally, identifying plant diseases has mostly been a manual process, relying on farmers and experts to inspect plants by hand. This involves spotting symptoms on leaves and stems, which can be slow, subjective, and not feasible on larger farms. Moreover, because many plant diseases share similar signs, there is a higher chance of misdiagnosis. This can lead to unnecessary pesticide use or delays in taking action, both of which can negatively impact the environment and lower crop yields.

With recent advancements in Machine Learning (ML) and Deep Learning (DL), particularly in image recognition, plant disease detection has become significantly more efficient. Models such as MobileNetV2, a lightweight and efficient convolutional neural network, have proven effective in identifying diseases from leaf images with high accuracy [3]. MobileNetV2 is especially suitable for deployment in mobile and web-based applications due to its balance between performance and computational efficiency [4]. These models can detect early signs of disease in mango, potato, and eggplant leaves, often before they are noticeable to the human eye, enabling faster and more reliable decision-making. While the integration of ML in agriculture is still emerging, its impact is growing rapidly. Automated disease detection empowers farmers with timely insights, reducing the spread of infections and minimizing crop loss. This shift from manual to AI-assisted diagnosis marks a crucial step toward sustainable and technology-driven farming.

To make these innovations accessible, web-based platforms are being developed where users
can upload leaf images for instant analysis. Such systems democratize access to advanced
agricultural tools, especially for small-scale farmers or those in remote areas. By offering
intuitive interfaces and reliable disease predictions, these applications enable better crop
protection and foster data-driven agricultural practices. Plant diseases fall into two broad categories, as shown in Figure 1.1:

Figure 1.1 - Categories of Plant Disease

1.1.1 Biotic diseases: These are caused by living organisms such as fungi, bacteria, viruses,
and nematodes. Biotic stress typically manifests as visible symptoms like leaf spots,
blights, rots, and wilting. Accurate classification of biotic diseases requires datasets
containing clear, labelled examples of each pathogen-induced symptom as shown in
Table 1.1.

Table 1.1 - Biotic Plant Disease

S.No.  Disease Name      Pathogen Type     Symptoms
1      Bacterial Blight  Bacteria          Water-soaked lesions on leaves, dark streaks, wilting
2      Powdery Mildew    Fungi             White powdery coating on leaves, stems, and flowers
3      Root Rot          Fungi             Yellowing leaves, stunted growth, mushy roots
4      Rust              Fungi             Small rust-coloured spots on leaves and stems
5      Downy Mildew      Fungi             Yellow patches on leaves, white growth on undersides
6      Leaf Spot         Bacteria/Fungi    Dark spots on leaves with yellow halos, leaf drop
7      Wilt Disease      Bacteria/Fungi    Sudden wilting, yellowing leaves, stunted growth
8      Viral Mosaic      Virus             Mottled yellow or green leaves, distorted growth, reduced yield

1.1.2 Abiotic diseases: These are caused by non-living factors such as nutrient deficiencies,
chemical exposure, drought, temperature extremes, or pollution. Unlike biotic diseases,
abiotic stress symptoms often resemble those caused by pathogens, which can complicate
classification. These include yellowing, browning, scorching, or irregular leaf shapes as
shown in Table 1.2.

Table 1.2 - Abiotic Plant Disease

S.No.  Condition Name                  Cause                    Symptoms
1      Drought Stress                  Lack of water            Wilting, leaf drop, yellowing or browning of leaves
2      Nutrient Deficiency (Nitrogen)  Soil nutrient imbalance  Yellowing of older leaves, stunted growth
3      Frost Damage                    Low temperatures         Blackened or scorched leaves, wilting, stunted growth
4      Sunscald                        Excess sunlight          White or brown patches on leaves, usually on sun-exposed areas
5      Chemical Burn (Herbicide)       Chemical exposure        Leaf curling, discoloration, distorted growth
6      Salt Damage                     High soil salinity       Leaf burn, stunted growth, wilting, yellowing
7      Water-logging                   Excess water             Root suffocation, yellowing leaves, wilting

1.2 Problem Statement

The agricultural sector continues to face critical challenges due to the widespread occurrence of plant diseases, which severely impact crop yield, food quality, and farmer income. Timely and precise identification of plant diseases is crucial for effective treatment and crop management. Traditional approaches, such as manual visual inspections, can be slow, error-prone, and impractical for large-scale monitoring. These methods often fail to handle the volume of data required for comprehensive plant health management. This leads to delayed interventions, misuse of agrochemicals, and significant economic losses.

In countries where agriculture is the main way of making a living and access to agricultural
expertise is limited, there is an urgent need for scalable, accurate, and user-friendly solutions.
Leaf-based visual symptoms are a common and accessible way to diagnose plant diseases, yet
interpreting them correctly requires specialized knowledge that many farmers do not have.
The core problem this project aims to address is the lack of accessible and efficient plant
disease diagnosis tools for key crops such as mango, potato, and eggplant. By leveraging the
lightweight MobileNetV2 deep learning architecture, this project seeks to develop a web-based
application that can automatically identify diseases from leaf images. This system is designed
to provide farmers and agricultural workers with real-time, accurate disease predictions and
solutions, ultimately reducing crop loss and improving sustainable farming practices.

1.3 Motivation

Agriculture remains a critical pillar of global economies, especially in developing regions where it serves as the main livelihood for millions of small-scale farmers. However, crop diseases continue to pose a serious challenge to agricultural productivity. The Food and Agriculture Organization (FAO) estimates that plant diseases are responsible for as much as 20 to 40 percent of crop losses worldwide each year. This puts global food supplies at risk, causes major disruptions in agricultural supply chains, and strains farmers' livelihoods.

Early spotting of plant diseases is crucial to stop them from spreading and causing serious harm, but traditional methods relying on manual inspection are often slow, subjective, and ineffective at scale. Visual similarities between different diseases can make diagnosis difficult even for experienced farmers. Moreover, limited access to agricultural experts and diagnostic tools in rural areas further delays timely interventions, leading to lower yields and increased economic losses.

This project aims to tackle these issues by harnessing modern, user-friendly technology.
Thanks to advancements in Artificial Intelligence (AI) and Machine Learning (ML),
particularly in image classification, we can now create tools that automatically detect plant
diseases through mobile apps or web platforms. Among various models, MobileNetV2 stands
out for being lightweight and optimized for performance on devices with limited computing
power, making it ideal for real-world agricultural applications.

The project aims to harness the capabilities of MobileNetV2 to identify diseases in mango,
potato, and eggplant leaves in real time. By allowing users to simply upload an image of a
diseased leaf, the system can detect the disease and provide relevant treatment suggestions
instantly. This enables timely decision-making and significantly reduces the dependency on
manual diagnosis or expert consultation.

Furthermore, the motivation extends beyond technical advancement: it is about making this solution accessible and inclusive. By building a user-friendly web interface with multilingual voice and text support, the system can serve farmers in remote or linguistically diverse areas, empowering them with knowledge and tools they might not otherwise access. Ultimately, this project seeks to bridge the gap between cutting-edge AI and grassroots farming needs, improving both productivity and sustainability.

1.4 Objectives
This project is motivated by the goal of creating a real-time plant disease detection system that
is practical, scalable, and easy to use in everyday agricultural settings. It focuses on achieving
several important objectives:

1.4.1 To study existing methodologies and analyse the characteristics of datasets available on different platforms for various plant diseases:
This objective aims to study existing plant disease image datasets to understand their structure, class distribution, quality, and limitations. The analysis will help identify the specific requirements for training an efficient MobileNetV2 model for real-time plant disease detection, focusing on aspects such as image resolution, disease variability, and environmental conditions.
1.4.2 To create and preprocess a high-quality, balanced dataset of real-time plant disease:
This objective involves collecting, labeling, and preprocessing images showing both
healthy and diseased leaves from mango, potato, and eggplant plants. The dataset will be
tailored to meet the input requirements of the MobileNetV2 model, ensuring it is
lightweight, diverse, and suitable for deployment in real-world agricultural settings.
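The resizing, normalization, and augmentation steps described above can be sketched in a few lines. The snippet below is an illustrative sketch, not the project's actual pipeline: it assumes images arrive as NumPy uint8 arrays and uses a simple nearest-neighbour resize to MobileNetV2's 224x224 input size, with a horizontal flip as a minimal augmentation.

```python
import numpy as np

def resize_nearest(img: np.ndarray, size: int = 224) -> np.ndarray:
    """Nearest-neighbour resize to (size, size); MobileNetV2 expects 224x224 input."""
    h, w = img.shape[:2]
    rows = np.arange(size) * h // size  # source row index for each output row
    cols = np.arange(size) * w // size  # source column index for each output column
    return img[rows][:, cols]

def preprocess(img: np.ndarray) -> np.ndarray:
    """Resize and scale pixel values from [0, 255] to [0, 1]."""
    return resize_nearest(img).astype(np.float32) / 255.0

def augment(img: np.ndarray) -> list:
    """Minimal augmentation: original plus a horizontal flip."""
    return [img, img[:, ::-1]]

# Example: a dummy 300x400 RGB leaf image standing in for a dataset sample
leaf = np.random.randint(0, 256, (300, 400, 3), dtype=np.uint8)
x = preprocess(leaf)
print(x.shape, float(x.min()) >= 0.0, float(x.max()) <= 1.0)  # (224, 224, 3) True True
```

In practice a library resizer (e.g. from an image-processing package) and richer augmentations would be used, but the shape and value-range contract shown here is what the model input requires.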

1.4.3 To develop a model to detect plant diseases with MobileNetV2:


This objective aims to build a lightweight and efficient plant disease detection model
using MobileNetV2. The model will classify plant leaf images into disease categories
accurately. MobileNetV2 enables faster predictions with lower computational cost. This
helps in early detection and supports real-time use on mobile devices.
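Transfer learning as described above freezes the pretrained MobileNetV2 backbone and trains only a new classification head. As a hedged illustration in plain NumPy (not the project's actual training code), the head maps MobileNetV2's 1280-dimensional global-average-pooled feature vector to the report's 21 classes via a dense layer and softmax:

```python
import numpy as np

NUM_CLASSES = 21   # healthy + diseased classes from the report
FEAT_DIM = 1280    # MobileNetV2 pooled feature size

rng = np.random.default_rng(0)
W = rng.normal(0, 0.01, (FEAT_DIM, NUM_CLASSES))  # trainable head weights
b = np.zeros(NUM_CLASSES)                         # trainable head bias

def softmax(z: np.ndarray) -> np.ndarray:
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def classify(features: np.ndarray):
    """Map a frozen-backbone feature vector to (predicted class, probabilities)."""
    probs = softmax(features @ W + b)
    return int(np.argmax(probs)), probs

features = rng.normal(size=FEAT_DIM)  # stand-in for backbone output
label, probs = classify(features)
print(label, round(float(probs.sum()), 6))  # probabilities sum to 1
```

In the real system `features` would come from the frozen backbone, and `W` and `b` would be learned by minimizing cross-entropy on the labelled leaf images.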

1.4.4 To develop a web-based Streamlit application that offers a user-friendly interface for real-time plant disease detection:
The application should allow users, especially farmers and agricultural workers, to upload leaf images and receive instant feedback on plant health status. Emphasis will be on simplicity, speed, and accessibility, making the solution usable even for non-technical users in rural settings.
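The Streamlit front end described above essentially wires an upload widget to a prediction routine. A hypothetical core of that routine is sketched below; the label names and treatment texts are illustrative placeholders, not the report's actual class list or advice.

```python
# Hypothetical helper the Streamlit UI would call on an uploaded leaf image.
# Class labels and treatment texts below are illustrative placeholders only.
TREATMENTS = {
    "Mango_Healthy": "No action needed; continue routine monitoring.",
    "Mango_Powdery_Mildew": "Apply a recommended sulfur-based fungicide.",
    "Potato_Late_Blight": "Remove infected foliage; apply an approved fungicide.",
}
LABELS = list(TREATMENTS)

def diagnose(probabilities):
    """Turn model output probabilities into a (label, confidence, advice) result."""
    best = max(range(len(LABELS)), key=lambda i: probabilities[i])
    label = LABELS[best]
    return label, probabilities[best], TREATMENTS[label]

# Example with a stub probability vector in place of real model output
label, confidence, advice = diagnose([0.1, 0.7, 0.2])
print(label, confidence)  # Mango_Powdery_Mildew 0.7
```

The web layer would then display `label`, `confidence`, and `advice` next to the uploaded image, keeping the interface to a single upload-and-read interaction for non-technical users.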

Chapter 2
Literature Survey

2.1 Introduction

Over the years, research has focused on developing techniques to automate plant disease identification in order to reduce dependency on manual inspection, which not only takes a lot of time but also leaves room for human error. With the advancement of deep learning and image recognition, especially through Convolutional Neural Networks (CNNs), there has been a major move toward automating the process of detecting plant diseases.

U. Abhishek (2025) [5] conducted a study using various public datasets, including RGB,
multispectral, and hyperspectral images. The research involved advanced deep learning
techniques like Convolutional Neural Networks (CNN), Vision Transformers, and Generative
Adversarial Networks (GANs). The models performed impressively, with the CNN achieving
an accuracy of 99.35%, Inception-v3 scoring 92.60%, and Mask R-CNN getting 78.8%. The
future goal of this work is to develop real-time, AI-driven plant disease detection systems that
can be used effectively in agricultural environments. B. Ananthakrishnan (2025) [6] utilized a
dataset of 5,867 tea leaf images collected from Kaggle and GitHub. The study employed CNN
along with traditional machine learning techniques like OpenCV, Multi-Layer Perceptron
(MLP), Support Vector Machine (SVM), and Decision Trees. The combination of methods
yielded a strong accuracy of 95.06%. The future direction suggested is integrating these models
into mobile applications to help farmers detect diseases in the field more easily. D. Aria (2024)
[7] worked with various public plant disease datasets available on Kaggle. The methodology
involved a blend of CNN, Recurrent Neural Networks (RNN), GANs, and Transformers. The
Deep Convolutional Neural Network (DCNN) achieved 99.35% accuracy, while a simpler
smartphone-optimized model scored 65%. This work aims to improve AI models to support
real-time crop monitoring by integrating contextual data for better field applications. K.
Priyanka (2024) [8] focused specifically on rice leaf diseases using datasets sourced from
Kaggle. A CNN model was employed to identify the diseases, which yielded an accuracy of
95%. The researcher proposes future improvements in the model’s efficiency, particularly to
make it more applicable for real-world agricultural scenarios.

Bahaa S (2023) [9] used a large dataset consisting of 38 classes of both infected and healthy
crop leaves sourced from GitHub. The model used was EfficientNetV2S, a powerful neural
network known for balancing accuracy and speed. This model achieved an accuracy of 95.01%.
Future work involves extending the trained model to handle a wider variety of crops and
environmental conditions. J. S. Diana (2023) [10] developed separate datasets for rice, wheat,
and maize diseases using images from Google. The researcher used a combination of Xception,
MobileNet, and InceptionV3 models. The models performed well with accuracies of 95.80%
for maize, 97.28% for rice, and 96.32% for wheat. Future work involves applying these deep
learning models to other crops for broader agricultural application. M. Emmanuel (2023) [11]
utilized a robust dataset comprising 5,170 real-world field images and 8,629 annotated leaves
across 27 disease classes, sourced from Kaggle. Models like MobileNet, VGG16, InceptionV3,
YOLOv3, and others were implemented using transfer learning. The MobileNet model
achieved the highest accuracy of 99.69%, while EfficientNetV and DenseNet121 also
performed excellently. The study recommends expanding the dataset and developing UAV-
based and mobile systems for real-time disease detection.

K. Faiza (2023) [12] collected maize leaf images from the University Research Farm Koont
(PMAS-AAUR), focusing on three maize diseases. A series of deep learning models, including
YOLOv3-tiny to YOLOv8n, were used. The final models performed with a high accuracy of
99.04%. The researcher aims to integrate the detection model into a mobile application for real-
time monitoring and tracking in the field. K. Jameer (2023) [13] relied on multiple datasets
including the popular PlantVillage dataset, all sourced from GitHub. The study experimented
with various models such as SVM, Bayes Discriminant Function, and a custom MDC
algorithm. The Bayes function achieved 98.32% accuracy, and the MDC model achieved
95.71%. The goal is to implement these models into real-world agricultural systems and
potentially develop smartphone-based diagnostic tools.

C. Md. Jalal Uddin (2023) [14] compiled a large dataset of 17,430 images from Kaggle,
covering bell pepper, tomato, and potato leaves with 14 distinct disease classes. Multiple
machine learning models including CNN, KNN, Logistic Regression, SVM, and Decision Tree
were tested. CNN led with 99% accuracy, while others like SVM and Logistic Regression
lagged. The study emphasizes future expansion to include multi-seasonal, real-world image
variations and AI-powered treatment suggestions. M. Shoaib (2023) [15] used the extensive
PlantVillage dataset, which includes around 38,000 images across 14 different crops. This
work employed a variety of deep learning models including AlexNet, VGG16, VGG19,
ResNet, InceptionV3, DenseNet, YOLOv3, SSD, and advanced object detection frameworks
like Faster R-CNN and Mask R-CNN. The results were impressive, with a top accuracy of
99.97%. The study envisions applying the models in real-world agricultural systems and
creating smartphone-based diagnostic tools for farmers. S. B. Tej (2023) [16] used datasets
collected from UAV-based (drone) remote sensing of various crops. The models tested
included Faster R-CNN, YOLOv5, ResNet50 combined with SVM, and U-Net for
segmentation. These models achieved high accuracy, such as 97.86% with ResNet50 + SVM,
and 96% with YOLOv5. The study aims to develop automated agricultural monitoring systems,
especially using drones for early disease detection.

C. Jackulin (2022) [17] worked with images collected from actual agricultural fields to reflect
real-world conditions. The methodology included CNN along with traditional ML models like
SVM, Random Forest, Decision Tree, KNN, and Naive Bayes. The CNN-based models
achieved high accuracy, with the top result being 98.56%. The research points toward
expanding datasets with real-world images and AI-based treatment recommendations. J. Arun
Pandian (2022) [18] used a massive dataset of 147,500 images covering 58 classes of plant leaf
diseases. The models applied were 14-layered DCNNs and other CNN-based architectures,
achieving 99.43% accuracy. The future scope of the study is focused on model automation and
using AI for early-stage disease detection. P. Jagamohan (2022) [19] focused on rice leaf
diseases, using a dataset of 5,932 images. The study used CNN-based methods and obtained
an accuracy of 97.20%. The main goal is to deploy the model on edge devices and integrate
real-time use in agricultural fields. M. Rabbia (2022) [20] targeted potato leaf diseases using a
dataset of 2,152 images sourced from Kaggle and Google. The study applied basic CNN
models, achieving accuracies such as 99.6% with CNN, 97% with K-NN, and 88% with SVM.
The future scope involves adapting the model to human activity recognition and expanding it
to other domains.

H. Sunil S. (2022) [21] used tomato leaf images from Kaggle showing various disorders. The
model architecture used was CNN, and the best-performing model reached 94% accuracy. The
study aims to extend the disease detection system to other plant species and deploy a mobile-
based detection system. A. A. Ahmed (2021) [22] compiled a very large dataset of 96,206
images from Kaggle, PlantVillage, and Google. Models like VGG16, VGG19, InceptionV3,
and VGG19 with logistic regression were applied. The VGG19 + Logistic Regression approach
achieved the highest accuracy of 97.8%. Future plans include UAV-based deployment for
aerial crop monitoring. Kowshik B (2021) [23] collected plant leaf images manually using a
digital camera. The study used traditional machine learning models like Naive Bayes, Decision
Tree, KNN, SVM, and Random Forest. The best model, Random Forest, achieved 79.23%
accuracy. The focus is now on developing mobile apps and improving models using transfer
learning and other advanced techniques. L. Lili (2021) [24] used a combination of public
datasets such as PlantVillage and AI Challenger 2018. Deep learning models like ResNet-50,
DenseNet-121, GoogLeNet, and others were tested. The DenseNet-121 model yielded the
highest accuracy of 99.75%. The study plans to build open-access plant disease image
repositories and optimize training time while maintaining high performance.

S. Rahul (2021) [25] worked with two separate datasets: 5,932 images for rice leaves and 1,500
for potato leaves. The proposed CNN model achieved high accuracy: 99.58% for rice and
97.66% for potato. The study emphasizes the model’s effectiveness and recommends
expanding its usage to other crop types and deploying it in mobile applications for real-time
use by farmers. P. Shanta (2021) [26] used multiple plant leaf disease datasets from Kaggle
and implemented deep learning models like AlexNet, SqueezeNet, and a custom CNN model.
The custom CNN reached 98% accuracy. The study also explored hybrid approaches such as
combining image segmentation with SVM or backpropagation neural networks, showing high
potential. The goal is to enhance classification and build real-time apps for farmers. T.
Divyansh (2020) [27] used 2,152 potato leaf images from the PlantVillage dataset. The models
evaluated include SVM, Naive Bayes, KNN, Decision Tree, and Random Forest. The best
result came from Random Forest with 79.23% accuracy. The study suggests improving dataset
diversity and building real-time farmer tools as future work. P. P. Kshyanaprava (2020) [28]
focused on maize leaf images for classification purposes. While the dataset was relatively
simple, it still contributed to understanding model effectiveness. A range of models were
applied including CNN, ResNet34, and transfer learning. The accuracy peaked at 96.21%. The
future aim is to deploy real-time monitoring using UAVs and expand functionality.
M. Omkar (2020) [29] used the comprehensive PlantVillage dataset with 54,444 images.
Models like YOLOv3 and ResNet18 were applied, achieving 96% accuracy with ResNet18.
The future plan includes extending the model to provide additional services like nearby market
listings for farmers. V. Aravindhan (2019) [30] used the PlantVillage dataset from GitHub and
integrated it with real-world applications using deep learning models such as VGG, ResNet,
and SqueezeNet. High accuracy (up to 97.86% with ResNet50) was achieved. The researcher
proposes integrating transfer learning and hyperspectral imaging to boost early detection
capabilities. A. Kawcher (2019) [31] collected rice leaf images from agricultural research
centers. The models implemented included AlexNet, VGG16, ResNet, etc., and they achieved
strong results with accuracies reaching up to 99.19% with ResNet50. The study suggests using
nature-inspired algorithms to further enhance performance. T. Muammer (2019) [32] gathered
real-world plant images from agricultural fields in various regions of Turkey. The work
employed multiple CNN models and aimed at integrating AI for early disease detection. The
models achieved up to 98.59% accuracy, and the goal is to develop mobile-based detection
systems with high efficiency.

S. H. Muhammad (2019) [33] used both public and custom agricultural datasets. Techniques
like CNN, VGG, Inception, and ResNet were implemented. Results ranged from 76% to
99.84%, depending on the model. The future plan includes reducing complexity for IoT-based
smart farming systems. M. Sammy V. (2019) [34] created a large dataset of 35,000 images
covering various crops like apple, corn, and tomato. Using models such as Faster R-CNN, SSD,
and R-FCN, the study reached accuracies between 90% and 97%. The aim is to improve model
generalization and collect more diverse, real-world images. N. Sapna (2019) [35] worked with
PlantVillage and custom real-world datasets. Using CNN-based deep learning models such as
LeafNet and PlantDiseaseNet, the best performance was from DenseNet-121 with 99.75%
accuracy. Future work involves expanding datasets and field-based deployments.

This chapter explores various research studies and existing systems that aim to tackle plant
diseases, discusses the methodologies employed, and identifies the gaps that this project aims
to address. Based on the research reviewed, a summary is given below; Table 2.1 presents the
surveyed work.

Table 2.1 - Literature Survey

1. U. Abhishek (2025) [5]
   Dataset: various public datasets including RGB, multispectral, and hyperspectral images.
   Methodology: CNN, Vision Transformers, R-CNN.
   Results (accuracy): CNN 99.35%; R-CNN 78.8%.
   Future scope: development of real-time AI-driven disease detection.

2. B. Ananthakrishnan (2025) [6]
   Dataset: 5,867 tea leaf images from Kaggle and GitHub.
   Methodology: CNN, OpenCV, MLP, SVM, and Decision Tree.
   Results (accuracy): CNN 95.06%.
   Future scope: integration with mobile applications for field-level implementation.

3. D. Aria (2024) [7]
   Dataset: various public plant disease image datasets from Kaggle.
   Methodology: CNN, RNN, GAN, and Transformers.
   Results (accuracy): CNN 99.35%.
   Future scope: improved AI models for real-time crop monitoring; integration of contextual data.

4. K. Priyanka (2024) [8]
   Dataset: Kaggle dataset with various rice leaf diseases.
   Methodology: CNN model.
   Results (accuracy): CNN 95%.
   Future scope: improving detection efficiency for real-world agricultural applications.

5. Bahaa S. (2023) [9]
   Dataset: 38 classes of infected and uninfected crop leaves from GitHub.
   Methodology: EfficientNetV2S.
   Results (accuracy): EfficientNetV2S 95.01%.
   Future scope: extending the model to additional crops and environmental conditions.

6. J. S. Diana (2023) [10]
   Dataset: developed datasets for rice, wheat, and maize diseases from Google.
   Methodology: Xception, MobileNet.
   Results (accuracy): maize/rice Xception 95.80%; wheat MobileNetV2 96.32%.
   Future scope: applying the proposed CNN model to other plant leaf diseases.

7. M. Emmanuel (2023) [11]
   Dataset: 5,170 field plant images; 8,629 annotated leaves across 27 disease classes from Kaggle.
   Methodology: MobileNet, VGG16, InceptionV3, InceptionResNetV2, YOLOv3.
   Results (accuracy): MobileNet 99.69%; YOLOv3 79.19%; InceptionV3 98.56%.
   Future scope: expansion of the dataset with more crops; development of mobile and UAV-based real-time detection systems.

8. K. Faiza (2023) [12]
   Dataset: collected from the University Research Farm Koont, PMAS-AAUR; includes three maize diseases.
   Methodology: YOLOv3-tiny, YOLOv4, YOLOv5s, YOLOv7s, and YOLOv8n.
   Results (accuracy): 99.04%.
   Future scope: integration of the model into a mobile application for real-time disease detection and tracking.

9. K. Jameer (2023) [13]
   Dataset: multiple datasets including PlantVillage from GitHub.
   Methodology: SVM, Bayes Discriminant Function, MDC algorithm.
   Results (accuracy): SVM 93%; MDC algorithm 95.71%; Bayes Discriminant Function 98.32%.
   Future scope: improved DL models for changing severity of plant diseases.

10. C. Md. Jalal Uddin (2023) [14]
    Dataset: 17,430 images from Kaggle covering bell peppers, tomatoes, and potatoes; 14 separate disease classes.
    Methodology: CNN, KNN, Logistic Regression, SVM, and Decision Tree.
    Results (accuracy): CNN 99%; SVM 66.15%; KNN 41.25%; Logistic Regression 42.43%.
    Future scope: application in real-world agricultural monitoring systems; development of smartphone-based diagnostic tools.

11. M. Shoaib (2023) [15]
    Dataset: PlantVillage (38,000 images across 14 crops).
    Methodology: R-CNN.
    Results (accuracy): R-CNN 97%.
    Future scope: expanding datasets with multi-environment, multi-season images.

12. S. B. Tej (2023) [16]
    Dataset: various UAV-based remote sensing datasets covering different crops.
    Methodology: Faster R-CNN, YOLOv5, ResNet50 + SVM.
    Results (accuracy): Faster R-CNN 95.0%; YOLOv5 96.0%; ResNet50 + SVM 97.86%.
    Future scope: improving model automation and integrating AI for early disease detection.

13. C. Jackulin (2022) [17]
    Dataset: real-world images collected from agricultural fields.
    Methodology: CNN, SVM, Random Forest, Decision Tree.
    Results (accuracy): CNN 98.56%; SVM 93.20%; Random Forest 90.78%; Decision Tree 87.45%.
    Future scope: deploying models on IoT and UAV-based monitoring systems.

14. J. Arun Pandian (2022) [18]
    Dataset: 147,500 images covering 58 different plant leaf classes.
    Methodology: 14-DCNN.
    Results (accuracy): 14-DCNN 99.97%.
    Future scope: applying the model to other plant species and optimizing it for real-world agricultural applications.

15. P. Jagamohan (2022) [19]
    Dataset: 5,932 images of diseased rice leaves.
    Methodology: CNNs, Darknet53-SVM model.
    Results (accuracy): CNN 99.43%.
    Future scope: deployment on edge devices for real-time agricultural use; integration of IoT-based sensors for real-time field applications.

16. M. Rabbia (2022) [20]
    Dataset: 2,152 images of potato leaf diseases from Kaggle and Google.
    Methodology: CNN.
    Results (accuracy): CNN 97.20%.
    Future scope: the model can be adapted for human disease detection and activity recognition in surveillance.

17. H. Sunil S. (2022) [21]
    Dataset: tomato leaf samples with various disorders from Kaggle.
    Methodology: CNN, KNN, SVM.
    Results (accuracy): CNN 99.6%; KNN 97%; SVM 88%.
    Future scope: extending the model to other plant species and deploying a mobile-based detection system.

18. A. A. Ahmed (2021) [22]
    Dataset: 96,206 images from Kaggle, PlantVillage, and Google Web Scraper.
    Methodology: CNN.
    Results (accuracy): CNN 94%.
    Future scope: deployment on UAVs for aerial crop monitoring.

19. Kowshik B (2021) [23]
    Dataset: images of plant leaves collected using a digital camera.
    Methodology: CNN, DNN.
    Results (accuracy): CNN 95.6%; DNN 93.2%.
    Future scope: expansion to provide additional features such as market price lists and nearby open markets.

20. L. Lili (2021) [24]
    Dataset: various public plant disease datasets, including PlantVillage and AI Challenger 2018.
    Methodology: CNN, VGG16, ResNet.
    Results (accuracy): CNN 96.50%; VGG16 97.12%; ResNet 99.19%.
    Future scope: integrating transfer learning and hyperspectral imaging for early disease detection.

21. S. Rahul (2021) [25]
    Dataset: rice leaf dataset (5,932 images); potato leaf dataset (1,500 images).
    Methodology: CNN.
    Results (accuracy): CNN 98.65%.
    Future scope: using nature-inspired algorithms to improve CNN model performance.

22. P. Shanta (2021) [26]
    Dataset: multiple plant leaf disease datasets from Kaggle.
    Methodology: CNN, with AlexNet and SqueezeNet for feature extraction.
    Results (accuracy): CNN 98%; AlexNet 95.65%; SqueezeNet 94.30%.
    Future scope: integrating AI-based early disease prediction models.

23. T. Divyansh (2020) [27]
    Dataset: PlantVillage dataset, 2,152 images of potato leaves.
    Methodology: VGG16, VGG19, InceptionV3.
    Results (accuracy): VGG16 92%; VGG19 97.8%; InceptionV3 94.30%.
    Future scope: developing a mobile-based disease detection system; improving dataset quality and optimizing computational efficiency.

24. P. P. Kshyanaprava (2020) [28]
    Dataset: images of maize leaves used for classification.
    Methodology: Naive Bayes, Decision Tree, K-Nearest Neighbor, Support Vector Machine, and Random Forest.
    Results (accuracy): Naive Bayes 77.46%; Decision Tree 74.35%; KNN 76.16%; SVM 77.56%; Random Forest 79.23%.
    Future scope: enhancing classification accuracy and developing a real-time application for farmers.

25. M. Omkar (2020) [29]
    Dataset: PlantVillage dataset, 54,444 images in total.
    Methodology: ResNet-34, transfer learning.
    Results (accuracy): ResNet-34 96.21%.
    Future scope: a web app was developed to allow users to upload images and detect diseases in real time.

26. V. Aravindhan (2019) [30]
    Dataset: PlantVillage dataset (from GitHub).
    Methodology: VGG16, ResNet18.
    Results (accuracy): simple model 50.5%; VGG16 50.26%; ResNet18 96%.
    Future scope: developing a real-time video-based system for unattended plant monitoring; enhancing the model to suggest treatments for detected diseases.

27. A. Kawcher (2019) [31]
    Dataset: rice leaf images collected from agricultural research centers.
    Methodology: Decision Tree, KNN, Logistic Regression, Naive Bayes.
    Results (accuracy): Decision Tree 94.91%; KNN 98.84%; Logistic Regression 75.46%; Naive Bayes 58.80%.
    Future scope: extending the model to mobile applications for farmers; implementing CNN-based models for automated feature extraction.

28. T. Muammer (2019) [32]
    Dataset: images collected from real-world agricultural fields in Turkey (Malatya, Bingöl, and Elazığ regions).
    Methodology: AlexNet, VGG16, GoogleNet, ResNet101.
    Results (accuracy): AlexNet 95.5%; VGG16 95.0%; GoogleNet 95.22%; ResNet101 97.45%.
    Future scope: developing a real-time mobile app for farmers; reducing computational complexity for IoT-based smart farming applications.

29. S. H. Muhammad (2019) [33]
    Dataset: various datasets including PlantVillage and custom agricultural datasets.
    Methodology: CNN-based models (AlexNet, GoogLeNet, ResNet-50, DenseNet-121).
    Results (accuracy): AlexNet 95.65%; GoogLeNet 99.35%; ResNet-50 99.19%; DenseNet-121 99.75%.
    Future scope: improving model generalization for real-world applications; collecting more diverse images with real-world variations.

30. M. Sammy V. (2019) [34]
    Dataset: 35,000 images of healthy and diseased plant leaves covering apple, corn, grapes, potato, sugarcane, and tomato.
    Methodology: CNN.
    Results (accuracy): CNN 96.50%.
    Future scope: expanding the dataset and deploying the model for field-based applications.

31. N. Sapna (2019) [35]
    Dataset: public datasets such as PlantVillage, plus custom datasets collected in real-world conditions.
    Methodology: CNN, AlexNet, GoogleNet, VGGNet, ResNet, Inception-ResNet.
    Results (accuracy): CNN 99.84%; SVM and other ML models 79.5–97.2%.
    Future scope: developing an open-access image repository for plant diseases; reducing training time while maintaining high accuracy.

32. J. Ashwin (2018) [36]
    Dataset: PlantVillage dataset with four categories (Bacterial Spot, Yellow Leaf Curl Virus, Late Blight, and Healthy Leaf).
    Methodology: CNN.
    Results (accuracy): CNN 98.59%.
    Future scope: extending the system to detect more plant diseases and implementing it in real time for farmers.

33. A. B. Jayme Garcia (2018) [37]
    Dataset: 50,000 images freely available for research.
    Methodology: CNN, transfer learning.
    Results (accuracy): CNN 76%.
    Future scope: improving dataset diversity and expanding beyond leaf images.

34. M. Akila (2018) [38]
    Dataset: images of diseased and healthy leaves from commercial, cereal, vegetable, and fruit crops.
    Methodology: Faster R-CNN, R-FCN, SSD.
    Results (accuracy): Faster R-CNN 95.66%; R-FCN 93.96%; SSD 90.94%.
    Future scope: enhancing dataset quality; building an automated disease detection system and improving model efficiency.

2.2 Research Gap


2.2.1 Lack of Real-World and Diverse Data
Many studies rely heavily on structured datasets like PlantVillage, which are collected
under controlled environments. There is a significant gap in using real-world, diverse,
and noisy field images captured under different lighting, backgrounds, and seasons.

2.2.2 Limited Crop and Disease Coverage


While some research focuses on major crops like rice, tomato, and potato, less attention
is given to regional or less common crops. There is a need to expand datasets and models
to support a wider variety of crops and disease types.

2.2.3 Underutilization of Multimodal Data (e.g., Weather, Soil, GPS)


Most approaches rely solely on image data. There is a gap in integrating contextual or
multimodal data such as temperature, humidity, or soil health, which could improve
prediction accuracy and decision-making in real farming conditions.

2.2.4 Limited Real-Time and Mobile Deployments

Despite high model accuracies, few models are optimized or deployed in real-time on
mobile or edge devices. There is a practical gap in making these systems accessible and
usable for farmers in the field.

2.2.5 Scarcity of Transfer Learning and Lightweight Models


While some studies explore transfer learning, many still use heavy architectures. There is
a gap in research on lightweight, efficient, and interpretable models that can run on
low-resource devices, especially for use in rural settings.

2.3 Dataset Collection

The dataset consists of 21 distinct classes encompassing a range of plant diseases and healthy
leaf conditions across three major crops: mango, potato, and eggplant. The images were
captured in natural lighting using a 48-megapixel rear camera on an iPhone, directly from rural
agricultural fields. This real-world data collection ensures that the images reflect authentic
environmental conditions, including variations in lighting, background, and leaf orientation,
making the dataset valuable for training robust and practical plant disease detection models.
Unlike many existing datasets that rely on controlled environments or online sources, this
dataset emphasizes field-level diversity and naturally occurring features, enhancing the
generalizability of the trained model. Each class contains a balanced distribution of healthy
and infected samples, aiding in the model’s ability to learn disease-specific patterns.
Additionally, care was taken to avoid duplicate or overly similar images to maintain dataset
integrity. The dataset underwent manual annotation and verification by domain experts to
ensure labeling accuracy.

According to Table 2.2, the dataset includes pictures of three plant species: potato, mango,
and eggplant. Each plant category contains images of both healthy and diseased leaves,
representing a range of common plant health conditions. In total, the dataset contains 7,983
images. Eggplant classes cover pest, blight, wilt, and healthy conditions. Mango includes
healthy samples and a wider variety of illnesses, such as bacterial and fungal infections.
Potato has the most classes, with a broad range of diseases caused by bacteria, viruses,
fungi, and pests. This varied dataset is a useful resource for developing machine learning
models for the identification and classification of plant diseases.

Table 2.2 - Dataset Images Classification

S. No.  Name of Plant  Class Name        Images per class  Images per plant  Total images
1       Eggplant       Healthy                  692
                       Leafspot Blight          426
                       Pest                     361
                       Wilt                     231              1710
2       Mango          Anthracnose              278
                       Bacterial Canker         190
                       Die Back                 189
                       Grey Blight               49
                       Healthy                  139
                       Leaf Blight               72
                       Mango Scab                58
                       Powdery Mildew           100              1075
3       Potato         Early Blight            1155
                       Late Blight              660
                       Bacteria                 499
                       Fungi                    790
                       Healthy                  703
                       Nematode                  69
                       Pest                     667
                       Phytophthora             234
                       Virus                    421              5198          7983
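The per-class counts in Table 2.2 can be sanity-checked programmatically. The snippet below is a small sketch that transcribes the table's numbers and verifies the per-plant subtotals and the overall total of 7,983 images:

```python
# Per-class image counts, transcribed from Table 2.2.
counts = {
    "Eggplant": {"Healthy": 692, "Leafspot Blight": 426, "Pest": 361, "Wilt": 231},
    "Mango": {"Anthracnose": 278, "Bacterial Canker": 190, "Die Back": 189,
              "Grey Blight": 49, "Healthy": 139, "Leaf Blight": 72,
              "Mango Scab": 58, "Powdery Mildew": 100},
    "Potato": {"Early Blight": 1155, "Late Blight": 660, "Bacteria": 499,
               "Fungi": 790, "Healthy": 703, "Nematode": 69, "Pest": 667,
               "Phytophthora": 234, "Virus": 421},
}

# Per-plant subtotals and the grand total should match the table.
per_plant = {plant: sum(classes.values()) for plant, classes in counts.items()}
total = sum(per_plant.values())
num_classes = sum(len(classes) for classes in counts.values())

print(per_plant)   # Eggplant: 1710, Mango: 1075, Potato: 5198
print(total)       # 7983
print(num_classes) # 21 classes in all
```

Checks like this are cheap insurance against transcription errors when the dataset is later split into training and test folds.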

Figure 2.1 shows leaf samples grouped by plant category under different conditions. The
conditions presented for potato leaves are Potato Pest, Potato Healthy, Potato Nematode,
Potato Virus, Potato Early Blight, Potato Phytophthora, Potato Late Blight, Potato Bacteria,
and Potato Fungi. The eggplant leaves include Eggplant Healthy, Eggplant Pest, Eggplant Leaf
Spot Blight, and Eggplant Wilt. For mango leaves, the conditions include Mango Die Back,
Mango Bacterial Canker, Mango Leaf Blight, Mango Scab, Mango Powdery Mildew, Mango Grey
Blight, Mango Anthracnose, and Mango Healthy.

Figure 2.1 - Sample leaves Dataset used
2.3.1 Potato Virus:
Potato Virus infection is a common and serious condition that can be caused by a variety
of viruses, including Potato Virus Y (PVY), Potato Virus X (PVX), and Potato Leafroll
Virus (PLRV). These viruses primarily infect plants through the use of infected seed
tubers and are transmitted by vectors such as aphids, especially Myzus persicae (green
peach aphid). Once inside the plant, the virus replicates in plant cells and spreads
through plasmodesmata to other parts of the plant, disrupting normal physiological
functions.

2.3.2 Potato Pest:


Potato plants are commonly affected by a wide range of insect pests that damage leaves
and reduce photosynthetic capacity. Among the most notorious pests is the Colorado
potato beetle (Leptinotarsa decemlineata), which has a strong resistance to many
chemical insecticides. These beetles, along with other leaf-eating insects like aphids,
cutworms, flea beetles, and thrips, can cause extensive damage. The Colorado potato
beetle larvae and adults feed voraciously on the foliage, often skeletonizing leaves and
leaving only the veins.

2.3.3 Potato Bacterial Disease:


Bacterial infections in potato plants are often caused by pathogens such as Ralstonia
solanacearum, Pectobacterium carotovorum (soft rot), and Clavibacter michiganensis
(ring rot). These bacteria enter the plant system through wounds, natural openings, or
root tips, especially when the soil is overly moist or poorly drained. The bacteria
multiply rapidly and disrupt the plant's vascular system by producing extracellular
polysaccharides that block xylem vessels, thus inhibiting water transport. This results
in wilting, leaf yellowing, blackened or decayed stems, and in severe cases, the entire
plant collapsing.

2.3.4 Potato Fungal Disease:


Fungal infections in potato plants are caused by several species including Alternaria
solani (Early Blight), Phytophthora infestans (Late Blight), and Fusarium spp. These
fungi release spores that land on leaves, germinate in the presence of moisture, and
penetrate plant tissues, causing necrotic lesions and chlorosis. Symptoms vary by
pathogen but generally include brown or black spots with concentric rings, yellowing
margins, and progressive decay of the leaf surface.

2.3.5 Potato Nematode:


Potato nematodes, particularly the root-knot nematodes (Meloidogyne spp.) and potato
cyst nematodes (Globodera rostochiensis and Globodera pallida), are microscopic
roundworms that infest the roots and rhizosphere of the plant, but their effects often
show up in the leaves. Though invisible to the naked eye, nematodes cause substantial
internal damage by penetrating root tissues and feeding on the plant's cells, which leads
to a compromised root system and ultimately affects nutrient and water uptake.

2.3.6 Potato Early Blight:


Early blight in potatoes, caused by the fungal pathogen Alternaria solani, is one of the
most common foliar diseases affecting solanaceous crops. The fungus survives in crop
debris and soil, and under conditions of high humidity and warm temperatures, it
releases spores that are carried by wind and water. Initial symptoms appear as small,
brown to black necrotic lesions typically surrounded by a yellow halo. These spots
expand into concentric ring patterns that resemble a target, eventually coalescing and
causing the leaf to wither and die.

2.3.7 Potato Healthy:


A healthy potato leaf is a sign of vigorous growth and absence of disease or stress. It
should be uniformly green, turgid, and free from any visible lesions, spots, chlorosis,
or curling. The leaf surface will be smooth with clearly defined venation and no
evidence of insect feeding or fungal spore formation. A healthy leaf indicates optimal
levels of sunlight exposure, water, and nutrient availability. It also suggests that the
plant is effectively managing environmental conditions such as temperature and
humidity.

2.3.8 Potato Phytophthora (Late Blight Early Stage):


This condition is caused by Phytophthora infestans, the same notorious oomycete that
was responsible for the Irish Potato Famine in the 19th century. The disease often
begins as water-soaked lesions that appear on the leaf edges or tips. These lesions
rapidly enlarge, becoming dark brown to black and surrounded by a pale green halo.
Under humid conditions, white fungal growth may be visible on the underside of the
leaves. The pathogen thrives in cool, moist environments and spreads rapidly,
especially during periods of prolonged leaf wetness. It can destroy entire fields in a
matter of days. Spores can travel through wind or splash from rain or irrigation.

2.3.9 Potato Late Blight:


Late blight is the advanced stage of the Phytophthora infestans infection and is far more
devastating. As the disease progresses from early symptoms, large necrotic lesions
dominate the leaves, leading to complete defoliation and death of the foliage. The
disease affects all above-ground parts and, in severe cases, reaches the tubers through
infected stems, leading to tuber rot, which continues in storage and results in significant
post-harvest losses. Leaves turn dark brown or black and may show water-soaked
edges. Infections are often accompanied by white mold-like growth on the underside of
leaves.
2.3.10 Eggplant Healthy:
A healthy eggplant leaf is large, ovate, and covered in fine hair-like trichomes. Its
coloration is a rich green with strong, visible venation, and it shows no signs of curling,
necrosis, spotting, or insect damage. Healthy leaves are vital for eggplants because they
drive high rates of photosynthesis, supporting fruit development and plant growth. Leaf
health is maintained through balanced soil nutrition especially nitrogen, phosphorus,
and potassium and protection from environmental stresses like extreme heat, drought,
or pest infestations.

2.3.11 Eggplant Pest:


Eggplants are highly susceptible to a wide array of insect pests that primarily target
their foliage and can significantly hinder their growth, productivity, and fruit quality.
Among the most devastating pests is the Leucinodes orbonalis, commonly known as
the Brinjal fruit and shoot borer, whose larvae bore into shoots and fruits, causing
wilting and internal damage. Other common pests include aphids (Aphis gossypii),
whiteflies (Bemisia tabaci), thrips, and spider mites, all of which attack the leaves by
sucking plant sap.

2.3.12 Eggplant Leaf Spot Blight:


Eggplant Leaf Spot Blight is primarily caused by the fungus Cercospora melongenae
and other similar pathogens. This disease is characterized by the appearance of small,
circular to irregular brown or dark lesions scattered across the leaf surface. These
lesions may have a yellowish halo and may coalesce into larger patches, leading to
premature leaf senescence. As the disease progresses, the affected leaves dry, curl, and
drop, resulting in severe defoliation and reduced fruit production.

2.3.13 Eggplant Wilt:


Eggplant wilt is often caused by either fungal pathogens like Fusarium oxysporum f.
sp. melongenae (Fusarium wilt) or Verticillium dahliae (Verticillium wilt), and
occasionally bacterial wilt caused by Ralstonia solanacearum. Fusarium wilt is soil-
borne and infects plants through the root system, moving up through the xylem where
it clogs vascular tissues and restricts water and nutrient transport. Symptoms first
appear as a sudden yellowing and wilting of lower leaves, followed by drooping,
necrosis, and collapse of the entire plant even in the presence of sufficient soil moisture.

2.3.14 Mango Die Back:


Mango Die Back is a complex disease syndrome, typically caused by the fungal
pathogen Lasiodiplodia theobromae. It leads to progressive drying and necrosis starting
from the tip of a branch and gradually moving backward toward the main trunk, giving
the appearance that the branch is “dying back.” Affected leaves turn brown, curl up,
and drop prematurely, and black lesions can be seen on the bark of twigs and branches.

2.3.15 Mango Bacterial Canker:


Bacterial Canker in mangoes is caused by Xanthomonas campestris pv.
mangiferaeindicae, a highly aggressive bacterial pathogen. This disease primarily
affects leaves, twigs, and fruits. Symptoms on leaves appear as angular, water-soaked
spots that become dark brown and are often surrounded by a yellow halo. As the
infection progresses, the spots coalesce, causing large necrotic areas, leaf curling, and
eventual defoliation. On fruits and branches, the bacteria cause raised, corky lesions
that can crack open, oozing a gummy exudate.

2.3.16 Mango Leaf Blight:


Mango Leaf Blight, commonly caused by Colletotrichum gloeosporioides, results in
the appearance of irregular, dark brown patches on the leaves which eventually lead to
complete leaf necrosis and shedding. These lesions can appear along the leaf margins
or scattered across the blade and may be accompanied by small black fruiting bodies of
the fungus. The pathogen survives in infected plant debris and is spread through wind
and rain.

2.3.17 Mango Scab:


Mango Scab is caused by the fungus Elsinoë mangiferae and primarily affects leaves,
twigs, and immature fruits. On leaves, it appears as small, irregular gray or brown
lesions often surrounded by a yellowish margin. These spots may merge to form larger
scabby patches, leading to premature leaf drop. On fruits, scab causes unattractive
surface blemishes that reduce their market value. The disease thrives in moist
conditions, especially during flowering and early fruit set.

2.3.18 Mango Powdery Mildew:
Powdery mildew on mango, caused by Oidium mangiferae, is a widespread and
economically significant disease that affects leaves, panicles, and young fruits. It is
characterized by the presence of a white, powdery fungal growth on the surface of
leaves and inflorescences. This growth consists of fungal mycelium and spores which
give the plant parts a dusty appearance. Infected leaves may curl, dry, and fall off. When
panicles are infected, flower drop occurs, leading to poor fruit set and low yields.

2.3.19 Mango Grey Blight


Mango Grey Blight is caused by Pestalotiopsis mangiferae. It affects leaves, causing
greyish to dark brown necrotic lesions that are often bordered by a yellow halo. The
disease may progress to cause extensive leaf drying and defoliation, especially in
younger trees or densely planted orchards. The pathogen survives in infected plant
debris and spreads via conidia dispersed by wind and rain.

2.3.20 Mango Healthy


A healthy mango leaf is typically long, lanceolate in shape, and dark green in color with
a leathery texture and smooth surface. It should be free from any form of discoloration,
spotting, curling, necrosis, or fungal growth. Healthy leaves indicate that the tree is
well-nourished, free from pest or pathogen attacks, and growing in optimal soil, water,
and climatic conditions.

2.3.21 Mango Anthracnose


Mango Anthracnose is one of the most severe fungal diseases affecting mango crops
and is caused by Colletotrichum gloeosporioides. The disease affects leaves, stems,
flowers, and fruits at all growth stages. Symptoms on leaves appear as irregular, black
or brown necrotic spots that may coalesce, causing large dead patches and premature
leaf drop.

Chapter 3
Proposed Methodology

3.1 Introduction
This project focuses on creating a real-time system that uses deep learning to identify plant
diseases from images.
Developed with farmers and agricultural professionals in mind, the system provides quick
yet accurate diagnoses that can be obtained easily through a web browser. This makes it
possible to react quickly to early signs of an outbreak and reduce its impact on the growing
crops. The proposed model is built upon MobileNetV2, a lightweight architecture well suited
to low-powered devices.
The model has been trained on a set of diseases affecting three major crops: mango, potato,
and eggplant. The methodology encompasses image acquisition and preprocessing, model
training and testing, and the deployment of a Streamlit web application to host the trained
model.
The whole system is designed to be fast, easily scalable, and usable even in areas that have
marginal internet connection. Thus, the presented methodology of leveraging AI to address
real-life agricultural issues can be a strong basis for smart farming and precision agriculture.

3.2 Problem Formulation


Plant diseases present a persistent and growing challenge to agricultural productivity,
threatening food security and the economic well-being of farmers, especially in developing
regions. Traditional methods for detecting diseases often depend on experts manually
analyzing data, which takes a lot of time and can be inconsistent, hard to scale, and not easily
accessible to everyone. These methods are particularly inadequate in remote or resource-
constrained areas where farmers lack timely access to agronomists or diagnostic facilities.
The core problem addressed in this project is the automated, real-time identification of plant
diseases from leaf images using machine learning. Specifically, the focus is on three widely
cultivated crops, mango, potato, and eggplant, which are prone to multiple types of diseases,
both biotic (caused by pathogens) and abiotic (caused by environmental stress).
The problem can be mathematically framed as a multi-class image classification task, where:
Let I denote the input image of a plant leaf.
Let Y denote the output class label, where Y ∈ {C1, C2, ..., C21} and each Ci represents a
unique disease or healthy condition from one of the three crops.
Additional constraints in the problem include low latency for real-time prediction, low model
size for deployment on web or mobile devices and robustness to noise (lighting, background
variation).
Thus, the formulated problem focuses not only on building a performant ML model but also
on integrating it into an intuitive and practical application for real-world agricultural use.
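The classification formulation above can be made concrete with a small sketch: a model produces one score per class, and the predicted label is the class with the highest probability after a softmax. The scores below are random placeholder values, not outputs of the project's trained model:

```python
import numpy as np

def softmax(scores: np.ndarray) -> np.ndarray:
    """Convert raw class scores into probabilities (numerically stable form)."""
    z = scores - scores.max()
    e = np.exp(z)
    return e / e.sum()

# One raw score per class C1..C21 -- illustrative random values only.
rng = np.random.default_rng(0)
scores = rng.normal(size=21)

probs = softmax(scores)               # probabilities over the 21 classes
predicted = int(np.argmax(probs))     # index of the predicted class

print(predicted, float(probs.sum()))  # probabilities sum to 1.0
```

In the deployed system this argmax step runs after the network's final layer, so latency and model size constraints apply to the score computation, not to the trivial label selection.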

3.3 Proposed Work


Traditional disease detection depends on experts manually analyzing symptoms, which is slow,
inconsistent, and hard to scale. The proposed system addresses this by focusing on three
major crops, mango, potato, and eggplant, and is trained to classify leaf images into 21
distinct classes, including both healthy and diseased conditions.
These diseases are mainly triggered by living organisms like fungi, bacteria, and viruses, and
if they are not identified in time, they can greatly reduce crop production. A key strength of
this project lies in the manual collection of plant leaf images using a smartphone, ensuring the
dataset reflects real-world agricultural environments. Unlike studies that rely solely on online
or lab-based datasets, this hands-on approach improves the model’s reliability and relevance
in practical field conditions.
Figure 3.1 presents the full workflow, starting with image collection and preparation, moving
through model training and prediction, and ending with deployment. The result is a Streamlit-
based web application designed for farmers, offering an easy-to-use interface with real-time
predictions, voice support, and treatment recommendations.

Figure 3.1 - Proposed Work flowchart
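The end-to-end flow of Figure 3.1 can be sketched as a small pipeline. The predictor stub and the treatment strings below are hypothetical stand-ins for illustration only; they are not the project's actual model, class advice, or agronomic guidance:

```python
# Illustrative end-to-end flow: image -> predicted class -> treatment advice.
# The treatment map entries here are assumed examples, not real recommendations.
TREATMENTS = {
    "Potato Early Blight": "Apply a protectant fungicide; remove infected debris.",
    "Mango Anthracnose": "Prune affected twigs; apply a copper-based fungicide.",
    "Eggplant Healthy": "No action needed.",
}

def predict_class(image_pixels):
    """Stand-in for the trained MobileNetV2 model's prediction."""
    return "Potato Early Blight"  # stub output for illustration

def diagnose(image_pixels):
    """Run the pipeline: classify the leaf, then look up advice for that class."""
    label = predict_class(image_pixels)
    advice = TREATMENTS.get(label, "No advice available for this class.")
    return label, advice

label, advice = diagnose(object())
print(label, "->", advice)
```

In the real application, `predict_class` would wrap the trained network and the web front end would display the label, advice, and voice output to the user.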

3.3.1 Data Collection (Using Smartphone – Real-Time Manual Dataset):


This phase is the most foundational and critical because it sets the tone for the entire model's
quality and applicability. In our project, data collection was not done by downloading or
copying datasets from any source. Instead, it was performed manually, by capturing leaf
images in real-time using a smartphone camera. This step involved going to fields, farms, or
gardens where actual mango, eggplant (brinjal), and potato plants were growing.
Images were taken under different natural conditions varying lighting (sunlight, shade,
cloudy), backgrounds (soil, grass, other leaves), and angles (top-view, side-view, zoomed-in,
etc.) to reflect the kind of inputs a real user might provide. These variations added richness
and robustness to the dataset, allowing the model to generalize better when deployed.

3.3.2 Data Identification & Classification:


After collection, each image was carefully observed and labeled by the project team. We
manually identified whether the leaf shown in the image was healthy or diseased, and in case
of disease, the specific type of disease was also classified.
This step involved not only visual inspection but possibly also consultation with agriculture
textbooks, online agricultural disease guides, or expert opinions to ensure the labels were
accurate
This classification process was fully hands-on, and no automated or third-party software was
used, again reinforcing the originality and credibility of our dataset. The result is a
custom-labeled dataset that truly represents the conditions in which actual farmers operate.

3.3.3 Data Preprocessing:


Raw images captured using a phone can vary widely in size, clarity, and background noise.
Hence, the preprocessing step was critical to prepare the data for training the MobileNetV2
model. Our preprocessing included:
• Resizing: Since MobileNet V2 expects input images of size 224 x 224 pixels, all leaf
images were resized to this dimension.
• Normalization: Pixel intensity values were scaled from 0–255 to a 0–1 range to
reduce computational complexity and enhance learning.
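The two steps above can be sketched in NumPy. In practice a library such as Pillow or OpenCV would perform the resizing; the nearest-neighbour indexing below is only a minimal stand-in to show the idea:

```python
import numpy as np

def preprocess(image: np.ndarray, size: int = 224) -> np.ndarray:
    """Resize to size x size (nearest neighbour) and scale pixels to [0, 1]."""
    h, w, _ = image.shape
    rows = np.arange(size) * h // size        # source row for each output row
    cols = np.arange(size) * w // size        # source column for each output column
    resized = image[rows[:, None], cols, :]   # nearest-neighbour resize
    return resized.astype(np.float32) / 255.0 # normalize 0-255 -> 0-1

# A fake 300x400 RGB array stands in for a smartphone capture.
photo = np.random.randint(0, 256, size=(300, 400, 3), dtype=np.uint8)
x = preprocess(photo)
print(x.shape)  # (224, 224, 3), ready for MobileNetV2's input layer
```

Scaling to [0, 1] keeps gradients well-conditioned during training; some MobileNetV2 pipelines instead scale to [-1, 1], so the same convention must be used at training and inference time.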

3.3.4 Model Selection:


At this point, several model architectures were considered for the task, including
ResNet, EfficientNetB3, MobileNetV2, and a simple CNN. After comparing model size,
training time, accuracy, and suitability for mobile deployment, MobileNetV2 was
selected because:
• It is lightweight and optimized for smartphones and embedded systems.
• It retains high classification performance despite being small and fast.
• It is well-suited for models that will be eventually deployed in rural or remote
locations without high computational power.

3.3.5 MobileNet V2 Architecture:


MobileNetV2 lies at the heart of our model. It is a Convolutional Neural Network (CNN)
architecture built with efficiency in mind, designed for resource-constrained environments
such as mobile phones and tablets. Figure 3.2 shows the layer-wise architecture of the model.
Key features of this architecture include:
• Inverted Residual Blocks: These enable the model to carry forward important features
from shallow to deeper layers.
• Linear Bottlenecks: Reduce dimensionality without losing feature information, allowing
the model to remain lightweight.
• Depthwise Separable Convolutions: These significantly reduce computation by
breaking standard convolutions into two smaller operations.
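The saving from depthwise separable convolutions can be verified with a short calculation: for a Dk × Dk kernel, M input channels, N output channels, and a Df × Df feature map, the cost falls to roughly 1/N + 1/Dk² of a standard convolution (illustrative arithmetic, not project code):

```python
def conv_macs(df, dk, m, n):
    """Multiply-accumulates for a standard Dk x Dk convolution: Df*Df output
    positions, each combining Dk*Dk*M inputs for every one of N output channels."""
    return dk * dk * m * n * df * df

def depthwise_separable_macs(df, dk, m, n):
    """Depthwise conv (Dk*Dk per channel) followed by a 1x1 pointwise conv (M -> N)."""
    return dk * dk * m * df * df + m * n * df * df

# Example layer: 112 x 112 feature map, 3x3 kernel, 32 -> 64 channels.
std = conv_macs(112, 3, 32, 64)
sep = depthwise_separable_macs(112, 3, 32, 64)
ratio = sep / std  # equals 1/N + 1/Dk^2, i.e. roughly an 8-9x reduction here
```

For a 3 × 3 kernel the factorization cuts computation by about 8–9×, which is what makes the network practical on phones.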

Figure 3.2 – MobileNetV2 Architecture

Table 3.1 outlines the layer-by-layer architecture of MobileNetV2, a lightweight
Convolutional Neural Network optimized for mobile and embedded vision applications.

Table 3.1 – Each Layer of MobileNetV2


Type of Layer                Input Size        Output Size       Kernel Size   Stride
Initial Conv                 224 × 224 × 3     112 × 112 × 32    3 × 3         2
Inverted Residual Block      112 × 112 × 32    112 × 112 × 16    3 × 3         1
Inverted Residual Block ×2   112 × 112 × 16    56 × 56 × 24      3 × 3         2
Inverted Residual Block ×3   56 × 56 × 24      28 × 28 × 32      3 × 3         2
Inverted Residual Block ×4   28 × 28 × 32      14 × 14 × 64      3 × 3         2
Inverted Residual Block ×3   14 × 14 × 64      14 × 14 × 96      3 × 3         1
Inverted Residual Block ×3   14 × 14 × 96      7 × 7 × 160       3 × 3         2
Inverted Residual Block ×1   7 × 7 × 160       7 × 7 × 320       3 × 3         1
Final Conv                   7 × 7 × 320       7 × 7 × 1280      1 × 1         1
Global Avg. Pooling          7 × 7 × 1280      1 × 1 × 1280      –             –
Fully Connected              1 × 1 × 1280      1 × 1 × 1000      –             –

The model learns features in a hierarchical manner: from basic edges and textures in early
layers to complex patterns like disease shapes and spot arrangements in later layers. After
passing through the full architecture, the final softmax layer outputs probabilities for each leaf
class (e.g., healthy, bacterial spot, early blight, etc.).
3.3.6 Training of Data:
This phase involves feeding the preprocessed real-time images into MobileNetV2. During
training:
• The model adjusted its internal weights to minimize prediction error using
backpropagation.
• Loss functions (like categorical cross-entropy) were used to measure how far predictions
were from actual labels.
• Optimization algorithms (such as Adam) helped fine-tune these weights efficiently.
• Multiple epochs (iterations over the dataset) were conducted to ensure the model
understood deep patterns in the leaf images.
The model essentially learned to distinguish between healthy and diseased leaves, and among
different disease types, based solely on the real-time images we collected, making it highly
realistic and practical for real-world use.
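A minimal training sketch with these ingredients follows; it uses toy random data and a tiny stand-in network instead of the full MobileNetV2, purely to illustrate the compile/fit loop:

```python
import numpy as np
import tensorflow as tf

num_classes = 21

# Toy stand-in data: random "images" and one-hot labels, purely illustrative.
x = np.random.rand(16, 224, 224, 3).astype("float32")
y = tf.keras.utils.to_categorical(np.random.randint(0, num_classes, 16), num_classes)

# Tiny stand-in network; in the project this is the MobileNetV2-based model.
inputs = tf.keras.Input(shape=(224, 224, 3))
pooled = tf.keras.layers.GlobalAveragePooling2D()(inputs)
outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(pooled)
model = tf.keras.Model(inputs, outputs)

# Categorical cross-entropy measures how far predictions are from the labels;
# Adam fine-tunes the weights via backpropagation over multiple epochs.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss="categorical_crossentropy",
              metrics=["accuracy"])
history = model.fit(x, y, epochs=2, batch_size=8, verbose=0)
```

The `history` object records loss and accuracy per epoch, which is how the curves shown later in the report are produced.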

3.3.7 Optimization:
After initial training, further optimization steps were carried out to prepare the model for
real-world deployment. These included:
• Quantization: Converting model weights to 8-bit integers to reduce size.
• Pruning: Removing redundant parameters to improve speed.
• Tuning hyperparameters such as learning rate and dropout to boost performance.
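Quantization of a trained Keras model can be sketched with TensorFlow Lite's post-training quantization (shown here on a small stand-in model; the exact conversion settings used in the project are an assumption):

```python
import tensorflow as tf

# Small stand-in for the trained Keras model (the project's MobileNetV2).
inputs = tf.keras.Input(shape=(224, 224, 3))
pooled = tf.keras.layers.GlobalAveragePooling2D()(inputs)
outputs = tf.keras.layers.Dense(21, activation="softmax")(pooled)
model = tf.keras.Model(inputs, outputs)

# Post-training quantization: Optimize.DEFAULT lets the TFLite converter
# store weights as 8-bit integers, shrinking the file for mobile deployment.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_bytes = converter.convert()  # serialized .tflite model, ready to save
```

The resulting byte string is written to a `.tflite` file, typically a fraction of the size of the original Keras model.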

3.3.8 Deployment:
With our trained and optimized MobileNetV2 model, we’re now able to deploy it as a real-
world solution. Because our model is based on data captured by phone, it works well in the
same conditions it was trained for, giving it a strong edge over generic models trained on
curated datasets.

3.3.9 Feedback:
Finally, once the model is deployed, a feedback mechanism can be added. If the web app or
system predicts incorrectly, users can report or correct the classification. This feedback is
valuable because:
• It helps collect new disease types or unseen conditions.
• Ensures the model remains accurate and relevant over time.

Chapter 4
Implementation

4.1 Introduction
This chapter describes how the data was collected, preprocessed, and used to train a deep learning
model, followed by deployment on a user-accessible platform. The core of the implementation
lies in leveraging the MobileNetV2 architecture for its efficiency and accuracy, combined with
the simplicity and accessibility of Streamlit for real-time web deployment.
The complete workflow from gathering field images to delivering real-time predictions has
been designed to ensure ease of use, speed, and reliability, especially catering to the needs of
farmers and agricultural experts.

4.2 Implementation Strategy


The implementation plan was carefully developed to make sure the system is not only
technically sound but also practical and user-friendly for everyday use in agricultural
environments. It starts by manually gathering images of plant leaves directly from the field,
capturing real-world conditions like natural lighting, diverse backgrounds, and genuine signs
of disease.
Once the dataset was curated and labelled, the images were pre-processed - resized,
normalized, and augmented to prepare them for model training. The deep learning model was
built using MobileNetV2, selected for its lightweight architecture, which allows for faster
inference without sacrificing too much accuracy. Figure 4.1 shows the end-to-end
implementation strategy of the plant disease detection system.

Figure 4.1 – Implementation Flowchart

4.2.1 Data Collection:


Images of plant leaves were manually captured using a smartphone in real agricultural
environments, ensuring authenticity and diversity in the dataset.

4.2.2 Data Annotation:


Every image was tagged to show whether it depicted a specific disease or a healthy
condition, providing the foundation for training the model using supervised learning.

4.2.3 Dataset Formation & Preprocessing:


After annotation, the dataset was curated and prepared. Preprocessing steps like resizing,
normalization, and image augmentation were applied to improve the model’s learning
performance and generalization.
• Resize all images to 224×224 pixels for compatibility with MobileNetV2.
• Normalize pixel values to a [0, 1] range.
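The augmentation step can be sketched with Keras preprocessing layers (the specific transforms and their strengths are assumptions, not the project's exact configuration):

```python
import numpy as np
import tensorflow as tf

# Random flips, rotations, and zooms applied on the fly during training to
# improve generalization; the chosen transforms here are illustrative.
augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),   # up to roughly ±10% of a full turn
    tf.keras.layers.RandomZoom(0.1),
])

batch = np.random.rand(4, 224, 224, 3).astype("float32")
augmented = augment(batch, training=True)  # training=True activates randomness
```

Because these are layers, they can also be placed at the front of the model so augmentation runs only during training and becomes a no-op at inference time.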

4.2.4 Model Selection and Training:
Use MobileNetV2 as the base model, leveraging its pre-trained ImageNet weights for
transfer learning. Customize the classification head by adding:
• GlobalAveragePooling2D
• Fully connected Dense layer with ReLU activation
• Dropout for regularization
• Output Dense layer with Softmax activation for multi-class classification (21
classes)
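A sketch of this transfer-learning setup follows. The head width and dropout rate shown are assumptions; `weights=None` keeps the sketch offline-runnable, whereas the project would load the pre-trained ImageNet weights with `weights="imagenet"`:

```python
import tensorflow as tf

NUM_CLASSES = 21  # healthy + disease categories used in the report

# MobileNetV2 backbone without its ImageNet classification top.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights=None)
base.trainable = False  # freeze the backbone for transfer learning

inputs = tf.keras.Input(shape=(224, 224, 3))
x = base(inputs, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dense(128, activation="relu")(x)  # head width: assumption
x = tf.keras.layers.Dropout(0.3)(x)                   # dropout rate: assumption
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
```

Freezing the backbone means only the small classification head is trained at first, which keeps training fast on the custom dataset.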

4.2.5 Deployment on Web:


Create a user-friendly web app using Streamlit that lets people upload images and get
instant predictions about potential diseases. After a diagnosis, the app should offer helpful
information about the condition along with possible treatment options.

4.3 Tools/Hardware/Software Requirements


4.3.1 Tools and Library used
● Python 3.9+ – Core Programming Language
● TensorFlow / Keras – Machine Learning Models implementation
● OpenCV – Image processing and augmentation
● Matplotlib – Visualization of data
● Streamlit – Real-time Web deployment
● NumPy and Pandas – Data Manipulation and analysis

4.3.2 Software Requirements


● Operating System – Windows 10/11
● Development Environments – VS Code, Jupyter Notebook, Anaconda Navigator
● Web Browser – Chrome / Microsoft Edge
● Package Manager – conda (Anaconda) and pip

4.3.3 Hardware Requirements


● Processor (CPU) – Intel i5 or AMD Ryzen5 minimum
● RAM – 8 GB minimum
● Storage – 10 GB free minimum

● GPU (optional) – NVIDIA GPU with CUDA
● Mobile Camera – Any good quality mobile camera.

4.4 Expected Outcome


The expected outcomes of this research project are as follows:
4.4.1 Creation of a Well-Annotated Plant Leaf Dataset:
A structured and preprocessed dataset consisting of 21 classes of healthy and diseased
plant leaf images, categorized into biotic and abiotic types.

4.4.2 Development of a Lightweight Deep Learning Model:


A trained MobileNetV2-based convolutional neural network (CNN) optimized for plant
disease classification with an accuracy of 74.39%.

4.4.3 Real-Time Disease Detection Interface:


A responsive and interactive web-based application using Streamlit, capable of predicting
plant diseases from uploaded images in real-time.

4.4.4 Improved Accessibility and Usability:


The model and web deployment are crafted to support usage on devices with limited
resources, such as smartphones and tablets, including in remote agricultural areas, ensuring
easy adoption by farmers and agronomists.

4.4.5 Visual and Analytical Insight:


Comprehensive performance evaluation of the model using classification reports,
confusion matrix, accuracy/loss curves, and prediction outputs for transparent and
interpretable results.

Chapter 5
Result & Discussion

5.1 Results Overview


The aim of this project was to make disease identification in common agricultural crops more
efficient by creating a MobileNetV2-based deep learning model that performs identification in
real time at roughly half the size of the baseline model. The model was trained, tested, and
evaluated on a custom dataset captured with smartphones in real-life field scenarios, and the
final trained model achieved an accuracy of 74.39%.

5.2 Model Performance Evaluation


5.2.1 Accuracy
The test accuracy of 74.39% shows that the model classified the data fairly
well. This metric covers 21 categories of diseased and healthy states in mango, potato,
and eggplant plants. It is particularly notable because the evaluation was performed on a
dataset captured in real-world environments with varying lighting, cluttered backgrounds,
and natural noise. Figure 5.1 presents the accuracy graph across training epochs.

Figure 5.1 – Accuracy Graph

5.2.2 Confusion Matrix Analysis
Performance of individual classes was evaluated using a multiclass confusion matrix.
This showed that although classes such as Potato Healthy and Mango
Powdery Mildew were classified with high accuracy, classes like Eggplant Wilt and
Mango Anthracnose had comparatively higher misclassification rates. Some of these
confusions were aggravated by the fact that symptoms can be very similar; different
diseases may manifest as leaf spots or wilting.
Figure 5.2 presents the confusion matrix, which highlights how well the plant disease
classification model performed, achieving an overall accuracy of 74%. The matrix
represents four categories: true positives (bottom-right), true negatives (top-left), false
positives (bottom-left), and false negatives (top-right). In this case, the model correctly
classified 370 diseased leaf images as diseased and 370 healthy leaf images as healthy.
However, it also misclassified 130 healthy leaves as diseased and 130 diseased leaves as
healthy.
This performance suggests that the model can generally tell the difference between
healthy and diseased samples, but there's still some confusion in a few cases, possibly
due to visual similarities between classes or limited image variety. Nevertheless, the
balanced confusion matrix suggests that the model is not biased toward any one class,
and further improvements can be achieved through dataset expansion or fine-tuning the
model architecture.

Figure 5.2 – Confusion Matrix

5.2.3 Precision, Recall, and F1-Score
The following average metrics were recorded across all classes:
• Precision: 0.75
• Recall: 0.73
• F1-Score: 0.74
These values indicate a fairly balanced model, with no significant overfitting or
underfitting on any particular class.
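These metrics can be reproduced with scikit-learn; the following sketch uses toy 3-class labels rather than the project's 21-class predictions:

```python
import numpy as np
from sklearn.metrics import precision_recall_fscore_support, confusion_matrix

# Toy stand-in labels for a 3-class problem (the report evaluates 21 classes).
y_true = np.array([0, 0, 1, 1, 2, 2, 2, 1])
y_pred = np.array([0, 1, 1, 1, 2, 0, 2, 1])

# Macro averaging weights every class equally, matching the per-class view
# given by the confusion matrix.
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="macro", zero_division=0)
cm = confusion_matrix(y_true, y_pred)  # rows: true class, columns: predicted
```

In the project, `y_true` and `y_pred` come from the held-out test images, and the same calls produce the averaged precision, recall, and F1 reported above.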
5.3 Visual Results and Web Application Output
5.3.1 Interface Overview
• Home Page: Describes the app’s purpose, usage instructions, and supported crops.

Figure 5.3 – Home Page Output


• About Page: Explains the technologies used including Python, TensorFlow,
Streamlit, and OpenCV.

Figure 5.4 – About Page Output


• Disease Detection Page: Allows real-time image upload and returns a classification
result with confidence percentage.

Figure 5.5 – Detect Page Output

5.4 Discussion
The development and testing of the proposed plant disease identification system have yielded
insightful outcomes that reflect both the promise and the limitations of machine learning (ML)
in real-world agricultural applications. With a final model accuracy of 74.39%, achieved using
MobileNetV2 on a custom, real-time dataset, this project represents a significant stride in
bridging the gap between theoretical deep learning research and practical deployment for
farmers. This section delves deeply into the interpretation of results, real-world relevance,
challenges, and future opportunities based on experimental findings.

5.4.1 Realistic Accuracy


Many advanced machine learning models designed to detect plant diseases have been
shown to achieve accuracy rates between 90% and 99%. However, these are often
obtained using clean, high-resolution images from controlled datasets such as
PlantVillage, captured in uniform lighting and background conditions. In contrast, this
project emphasizes real-world usability over artificial benchmarks.

Our dataset, gathered through smartphones in actual field conditions, contains images
with varied lighting, backgrounds, and angles, simulating the way farmers would capture
images. This accounts for the 74.39% accuracy.
5.4.2 Environmental Noise
The model’s performance was impacted by natural noise in field photography. This
includes:
• Inconsistent lighting (e.g., shadowed vs. sunlit leaves)
• Unclear backgrounds (soil, weeds, debris)
• Partial leaves or occlusion (only half the leaf visible)
• Blurred images due to motion or camera shake
These artifacts affected feature extraction, as MobileNetV2 depends heavily on edge
patterns and texture granularity to classify. Yet, this reinforces the project's core
principle: building for reality, not ideal lab conditions.

5.4.3 Positioning Against State-of-the-Art Solutions


When compared with other studies in the literature:
• MobileNetV2 (74%) underperforms CNN-based systems on ideal datasets (90–
99%)
• However, it outperforms heavy models when applied to noisy, real-world data
without GPU reliance
• It is far more practical for rural deployment, requiring no additional hardware
Hence, the project prioritizes "Good Enough, Everywhere" over "Perfect, Nowhere", a
much-needed shift in ML for agriculture.

5.4.4 Deployment
One of the project’s biggest wins is its deployability:
• The trained model size was reduced to under 14MB, making it viable for mobile
devices.
• Inference time was under 2 seconds per image on mid-tier smartphones.
• The model is integrated into a Streamlit web app with a simple interface,
multilingual support, and actionable output.
Compared to complex models like ResNet50 or InceptionV3, which require GPU
resources and extensive RAM, this system can be used in rural, low-bandwidth, and
offline settings, fulfilling its intended use case.

Chapter 6
Conclusion & Future Scope

6.1 Conclusion

This project showcases a practical approach to detecting plant diseases in real time using
images and the MobileNetV2 deep learning model. It centers on four important crops (mango,
potato, spinach, and eggplant) and is trained to identify 21 different categories of healthy and
diseased leaves.
collected with a smartphone directly from actual farms, making the system more applicable
and reliable in real-world agricultural settings.

The MobileNetV2 model, with a lightweight size of about 3.5 MB, achieved a test accuracy of
74.39%, which is quite good performance for a multi-class classification problem with a
moderate computational burden. The model was implemented as a Streamlit web application
where users can upload pictures and receive a report of the detected disease along with voice
and text recommendations for treatment. This makes the system not only technically sound
and realistic, but also feasible for farmers with limited resources in rural areas.

Overall, the project bridges the gap between machine learning technology and the agricultural
sector, offering a scalable, cost-effective solution for early disease detection and better crop
management.

6.2 Future Scope

While the current system lays a strong foundation, there are several opportunities to extend and
improve its functionality:
6.2.1 Expansion to More Crops and Diseases:
The model can be extended to cover additional crops and a wider variety of diseases,
increasing its relevance and utility across different regions.

6.2.2 Integration with Mobile Applications:
Creating a dedicated Android/iOS app would allow offline usage, enabling farmers in
remote areas with limited internet connectivity to benefit from the system.
6.2.3 Real-Time Camera Capture and Diagnosis:
Incorporating live image capture using a mobile device camera for direct, on-spot
diagnosis would enhance usability and convenience.
6.2.4 Multi-Language and Regional Voice Support:
Expanding voice output to include more regional languages and dialects can further
improve accessibility for diverse user groups.
6.2.5 Feedback Loop and Retraining:
Implementing a system where users can give feedback on predictions will allow the
model to improve over time through retraining on new data.
6.2.6 Cloud-Based Monitoring Dashboard:
A centralized dashboard for agricultural officers to monitor disease spread trends
across regions can be developed to aid in larger-scale agricultural planning and
disease control.

References

1. Food and Agriculture Organization of the United Nations (FAO). (2021). The State of Food
and Agriculture.
2. Strange, R. N., & Scott, P. R. (2005). Plant disease: a threat to global food security.
Philosophical Transactions of the Royal Society B: Biological Sciences, 360(1464), 1251–
1262.
3. Sandler, M., Howard, A., Zhu, M., Zhmoginov, A., & Chen, L. C. (2018). MobileNetV2:
Inverted residuals and linear bottlenecks. In Proceedings of the IEEE Conference on
Computer Vision and Pattern Recognition (CVPR), 4510–4520.
4. Too, E. C., Yujian, L., Njuki, S., & Yingchun, L. (2019). A comparative study of fine-tuning
deep learning models for plant disease identification. Computers and Electronics in
Agriculture, 161, 272–279.
5. Upadhyay, A., Chandel, N. S., Singh, K. P., Chakraborty, S. K., Nandede, B. M., Kumar, M.,
& Elbeltagi, A. (2025). Deep learning and computer vision in plant disease detection: a
comprehensive review of techniques, models, and trends in precision agriculture. Artificial
Intelligence Review, 58(3), 1-64.
6. Balasundaram, A., Sundaresan, P., Bhavsar, A., Mattu, M., Kavitha, M. S., & Shaik, A.
(2025). Tea leaf disease detection using segment anything model and deep convolutional
neural networks. Results in Engineering, 25, 103784.
7. Dolatabadian, A., Neik, T. X., Danilevicz, M. F., Upadhyaya, S. R., Batley, J., & Edwards,
D. (2025). Image‐based crop disease detection using machine learning. Plant Pathology,
74(1), 18-38.
8. Kulkarni, P., & Shastri, S. (2024). Rice leaf diseases detection using machine learning.
Journal of Scientific Research and Technology, 17-22.
9. Hamed, B. S., Hussein, M. M., & Mousa, A. M. (2023). Plant Disease Detection Using Deep
Learning. Int. J. Intell. Syst. Appl, 15, 38-50.
10. Joseph, D. S., Pawar, P. M., & Chakradeo, K. (2024). Real-time plant disease dataset
development and detection of plant disease using deep learning. IEEE Access, 12, 16310-
16333.
11. Moupojou, E., Tagne, A., Retraint, F., Tadonkemwa, A., Wilfried, D., Tapamo, H., &
Nkenlifack, M. (2023). FieldPlant: A dataset of field plant images for plant disease detection
and classification with deep learning. IEEE Access, 11, 35398-35410.
12. Khan, F., Zafar, N., Tahir, M. N., Aqib, M., Waheed, H., & Haroon, Z. (2023). A mobile-
based system for maize plant leaf disease detection and classification using deep learning.
Frontiers in Plant Science, 14, 1079366.
13. Kotwal, J., Kashyap, R., & Pathan, S. (2023). Agricultural plant diseases identification: From
traditional approach to deep learning. Materials Today: Proceedings, 80, 344-356.
14. C. Md. Jalaluddin & Al-Tuwaijari, J. M. (2023, April). Plant leaf diseases detection and
classification using image processing and deep learning techniques. In 2020 International
Conference on Computer Science and Software Engineering (CSASE) (pp. 259-265). IEEE.
15. Shoaib, M., Shah, B., Ei-Sappagh, S., Ali, A., Ullah, A., Alenezi, F., ... & Ali, F. (2023). An
advanced deep learning models-based plant disease detection: A review of recent research.
Frontiers in Plant Science, 14, 1158933.
16. Shahi, T. B., Xu, C. Y., Neupane, A., & Guo, W. (2023). Recent advances in crop disease
detection using UAV and deep learning techniques. Remote Sensing, 15(9), 2450.
17. Jackulin, C., & Murugavalli, S. J. M. S. (2022). A comprehensive review on detection of
plant disease using machine learning and deep learning approaches. Measurement: Sensors,
24, 100441.
18. Pandian, J. A., Kumar, V. D., Geman, O., Hnatiuc, M., Arif, M., & Kanchanadevi, K. (2022).
Plant disease detection using deep convolutional neural network. Applied Sciences, 12(14),
6982.
19. Padhi, J., Mishra, K., Ratha, A. K., Behera, S. K., Sethy, P. K., & Nanthaamornphong, A.
(2025). Enhancing Paddy Leaf Disease Diagnosis-a Hybrid CNN Model using Simulated
Thermal Imaging. Smart Agricultural Technology, 100814.
20. Mahum, R., Munir, H., Mughal, Z. U. N., Awais, M., Sher Khan, F., Saqlain, M., ... & Tlili,
I. (2023). A novel framework for potato leaf disease detection using an efficient deep learning
model. Human and Ecological Risk Assessment: An International Journal, 29(2), 303-326.
21. Harakannanavar, S. S., Rudagi, J. M., Puranikmath, V. I., Siddiqua, A., & Pramodhini, R.
(2022). Plant leaf disease detection using computer vision and machine learning algorithms.
Global Transitions Proceedings, 3(1), 305-310
22. Ahmed, A. A., & Reddy, G. H. (2021). A mobile-based system for detecting plant leaf
diseases using deep learning. AgriEngineering, 3(3), 478-493.
23. Kowshik, B., Savitha, V., Karpagam, G., & Sangeetha, K. (2021). Plant disease detection
using deep learning. International Research Journal on Advanced Science Hub, 3(3S), 30-33.
24. Li, L., Zhang, S., & Wang, B. (2021). Plant disease detection and classification by deep
learning a review. IEEE Access, 9, 56683-56698.
25. Sharma, R., Singh, A., Jhanjhi, N. Z., Masud, M., Jaha, E. S., & Verma, S. (2022). Plant
Disease Diagnosis and Image Classification Using Deep Learning. Computers, Materials &
Continua, 71(2).
26. S. Prabhakar (2021). Plant disease identification using Deep Learning: A review. The Indian
Journal of Agricultural Sciences, 90(2), 249-257.
27. Tiwari, D., Ashish, M., Gangwar, N., Sharma, A., Patel, S., & Bhardwaj, S. (2020, May).
Potato leaf diseases detection using deep learning. In 2020 4th international conference on
intelligent computing and control systems (ICICCS) (pp. 461-466). IEEE.
28. Panigrahi, K. P., Das, H., Sahoo, A. K., & Moharana, S. C. (2020). Maize leaf disease
detection and classification using machine learning algorithms. In Progress in Computing,
Analytics and Networking: Proceedings of ICCAN 2019 (pp. 659-669). Springer Singapore.
29. Mindhe, O., Kurkute, O., Naxikar, S., & Raje, N. (2020). Plant disease detection using deep
learning. International Research Journal of Engineering and Technology, 2497-2503.
30. Venkataramanan, A., Honakeri, D. K. P., & Agarwal, P. (2019). Plant disease detection and
classification using deep neural networks. Int. J. Comput. Sci. Eng, 11(9), 40-46
31. Ahmed, K., Shahidi, T. R., Alam, S. M. I., & Momen, S. (2019, December). Rice leaf disease
detection using machine learning techniques. In 2019 International Conference on
Sustainable Technologies for Industry 4.0 (STI) (pp. 1-5). IEEE.
32. Türkoğlu, M., & Hanbay, D. (2019). Plant disease and pest detection using deep learning-
based features. Turkish Journal of Electrical Engineering and Computer Sciences, 27(3),
1636-1651.
33. Saleem, M. H., Potgieter, J., & Arif, K. M. (2019). Plant disease detection and classification
by deep learning. Plants, 8(11), 468.
34. Militante, S. V., Gerardo, B. D., & Dionisio, N. V. (2019, October). Plant leaf detection and
disease recognition using deep learning. In 2019 IEEE Eurasia conference on IOT,
communication and engineering (ECICE) (pp. 579-582). IEEE.
35. Nigam, S., & Jain, R. (2020). Plant disease identification using Deep Learning: A review.
The Indian Journal of Agricultural Sciences, 90(2), 249-257
36. Dhakal, A., & Shakya, S. (2018). Image-based plant disease detection with deep learning.
International Journal of Computer Trends and Technology, 61(1), 26-29.
37. Barbedo, J. G. (2018). Factors influencing the use of deep learning for plant disease
recognition. Biosystems engineering, 172, 84-91.

38. Akila, M., & Deepan, P. (2018). Detection and classification of plant leaf diseases by using
deep learning algorithm. International Journal of Engineering Research & Technology
(IJERT), 6(7), 1-5.
