CHAPTER THREE: METHODOLOGY
3.1 Introduction
This chapter describes the research and development methodology used to design, implement,
and evaluate an online voting system for the Kenyatta University Students' Association (KUSA)
that is secure, efficient, and accessible. The chapter details the development framework,
techniques for data collection, tools and processes for data analysis, implementation and testing
tools, time schedule, and project costs, ensuring rigor and feasibility within the constraints of a solo project
(Hevner & Chatterjee, 2010).
3.2 Development Framework
The study will adopt the System Development Life Cycle (SDLC) as the main framework for
developing the voting system, ensuring a structured and orderly process from planning and
requirements analysis to design, implementation, testing, deployment, and maintenance. To
improve flexibility, Agile development principles will be integrated during implementation and
testing, allowing iterative development in short sprints, regular stakeholder feedback, continuous
testing, and timely adaptation to emerging requirements. This combined approach will reduce
risks and support the development of a secure, reliable, and user-centered voting system.
3.3 Research Design
The study will adopt a mixed-methods research design, combining both quantitative and
qualitative approaches to better understand the requirements and effectiveness of the voting
system. Quantitative data will be collected using structured questionnaires to measure system
stability, efficiency, and performance. This data will provide numerical results that will be
analyzed to evaluate how well the system meets user needs.
Qualitative data will be collected through interviews with selected users and administrators to
gather detailed information on their experiences, opinions, and expectations regarding the voting
system. This approach will help capture insights that could not be obtained through
questionnaires alone.
By using both methods, the study will be able to obtain a more complete and reliable understanding
of user requirements and system performance. The combined findings will inform the design,
development, and evaluation of the voting system.
3.4 Techniques for Data Collection
A combination of primary and secondary data collection techniques will be utilized.
3.4.1 Quantitative Data Collection
Online Surveys: Structured questionnaires will be administered via Google Forms to assess
user requirements and adoption readiness.
Performance Metrics: Automated logs will capture data during pilot testing, including vote
submission times and system latency, across 100 simulated voting instances.
Observation Checklists: Structured observation checklists will be used during system testing.
System Usability Scale (SUS): A standardized 10-item questionnaire will be used to measure
the system's perceived usability.
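The SUS instrument mentioned above has a standard scoring rule that can be sketched in a few lines; the function name and the example responses below are illustrative assumptions, not part of the system:

```javascript
// Standard SUS scoring: 10 items rated 1-5; odd-numbered items are
// positively worded (each contributes score - 1), even-numbered items
// are negatively worded (each contributes 5 - score); the 0-40 raw
// sum is multiplied by 2.5 to give a 0-100 score.
function susScore(answers) {
  if (answers.length !== 10) throw new Error("SUS requires 10 item scores");
  let raw = 0;
  answers.forEach((score, i) => {
    raw += i % 2 === 0 ? score - 1 : 5 - score; // index 0 = item 1 (odd)
  });
  return raw * 2.5;
}

// Example: a respondent answering 4 on every positive item and 2 on
// every negative item.
console.log(susScore([4, 2, 4, 2, 4, 2, 4, 2, 4, 2])); // → 75
```

Scores above the commonly cited SUS benchmark of 68 indicate above-average perceived usability.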
3.4.2 Qualitative Data Collection
Semi-Structured Interviews: One-on-one interviews with users and administrators.
Focus Group Discussions (FGDs): Small group discussions to explore shared experiences;
for example, virtual groups of 5–8 KUSA officials will be conducted via Zoom to gather
administrative insights.
3.4.3 Secondary Data
A review of the literature, including the Estonian e-voting case study, will benchmark the
system against best practices.
3.5 Tools and Processes for Data Analysis
3.5.1 Quantitative Data Analysis
Quantitative data collected through questionnaires and system testing will be analyzed using
statistical tools such as Microsoft Excel. The data will first be checked for completeness and
accuracy, then organized for analysis.
Descriptive statistics such as frequencies, percentages, and mean scores will be used to
summarize the data. The results will be presented using tables and charts to make them easy to
understand, and the findings will be used to evaluate the usability and performance of the voting
system.
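The descriptive statistics described above could, for instance, be computed as in the following sketch; the responses are made-up Likert-scale values, and in practice the same summaries would be produced in Microsoft Excel:

```javascript
// Descriptive statistics (count, mean, frequencies, percentages) for a
// set of 1-5 Likert responses to one questionnaire item.
function describe(responses) {
  const n = responses.length;
  const mean = responses.reduce((a, b) => a + b, 0) / n;
  const frequencies = {};
  responses.forEach(r => { frequencies[r] = (frequencies[r] || 0) + 1; });
  const percentages = {};
  for (const [value, count] of Object.entries(frequencies)) {
    percentages[value] = +(100 * count / n).toFixed(1); // one decimal place
  }
  return { n, mean, frequencies, percentages };
}

// Illustrative data, not actual study results.
console.log(describe([5, 4, 4, 3, 5, 4, 2, 4]));
```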
3.5.2 Qualitative Data Analysis
Qualitative data collected through interviews and open-ended questions will be analyzed using
thematic analysis. The data will first be reviewed carefully to understand the responses. Similar
responses will then be grouped into common themes such as usability, security, and user
satisfaction.
Key ideas and opinions from participants will be identified and summarized. These themes will
help explain user experiences and support the findings from the quantitative data. The results of
the qualitative analysis will be presented in descriptive form, using brief explanations and
examples where necessary.
3.5.3 Mixed-Methods Integration
Mixed-methods integration will be done through triangulation, where quantitative and qualitative
results will be combined. SUS (System Usability Scale) scores from questionnaires will be
compared with interview responses to help explain the results. This will make it easier to
understand the reasons behind high or low usability scores based on user experiences.
Combining both data types will improve the accuracy of the study findings.
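One simple form of this triangulation can be sketched as follows; the participant records, theme labels, and scores are hypothetical, and the benchmark of 68 is the commonly cited SUS average:

```javascript
// Triangulation sketch: pair each participant's SUS score with the
// themes coded from their interview, then count which themes co-occur
// with below-benchmark usability scores (hypothetical data).
const participants = [
  { id: "P1", sus: 85, themes: ["easy navigation"] },
  { id: "P2", sus: 42, themes: ["login confusion", "slow loading"] },
  { id: "P3", sus: 48, themes: ["login confusion"] },
];

function themesBelowBenchmark(data, benchmark = 68) {
  const counts = {};
  data.filter(p => p.sus < benchmark)       // keep low-scoring participants
      .flatMap(p => p.themes)               // gather their interview themes
      .forEach(t => { counts[t] = (counts[t] || 0) + 1; });
  return counts;
}

console.log(themesBelowBenchmark(participants)); // → { 'login confusion': 2, 'slow loading': 1 }
```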
3.5.4 Validity and Reliability Measures
To ensure the validity and reliability of the study findings, appropriate analytical, usability, and
security evaluation tools will be applied throughout the system development and evaluation
process.
Category               Tool/Method                      Purpose
Statistical Analysis   Microsoft Excel                  Data cleaning and quantitative analysis
Qualitative Analysis   NVivo / Manual coding            Thematic analysis of interview data
Security Testing       OWASP ZAP                        Identification of system vulnerabilities
Usability Evaluation   System Usability Scale (SUS)     Measurement of system efficiency and usability
Methodological Basis   Vaishnavi & Kuechler (2015)      Ensuring research validity
3.6 Tools for System Implementation and Testing
This section describes the tools that will be used to develop and test the voting system to ensure
correct functionality and system reliability.
3.6.1 System Implementation Tools
Programming Language - Used to develop system logic and functionalities (Python,
JavaScript)
Web Framework - Used to structure and speed up system development (React for the
frontend and [Link] (Express) for the backend).
Database Management System (DBMS) - Used to store and manage voting data securely
(MySQL or MongoDB)
Web Server - Hosts and runs the voting system (Apache or [Link] built-in server)
Development Environment - Code editors and IDEs used for system development (Visual
Studio Code)
Version Control System - Used to manage source code and track changes (Git, GitHub)
3.6.2 System Testing Tools
Jest / Mocha
Will be used for unit testing of backend JavaScript modules to ensure that individual
functions and classes work properly.
Postman
Will be used for integration and functional testing of APIs to ensure proper data transfer
between system modules.
Selenium WebDriver
Will be used for functional testing and browser compatibility testing to ensure that the
system is working properly across different web browsers like Chrome, Firefox, and
Edge.
JMeter
Will be used for performance and load testing to determine the response time and
reliability of the system under different loads.
Manual Testing Tools (Browser Developer Tools)
Browser developer tools will be used for manual testing, debugging, and detecting
UI/runtime errors during system execution.
User Acceptance Testing (UAT) Checklists & Feedback Forms
Will be used during UAT to obtain user feedback and ensure that the system satisfies user
requirements.
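The response-time measurements gathered during performance testing are typically summarized as percentiles, which can be sketched as follows; the sample latencies below are simulated, not measured:

```javascript
// Nearest-rank percentile over a list of latency samples (ms), the kind
// of summary a JMeter aggregate report provides.
function percentile(latencies, p) {
  const sorted = [...latencies].sort((a, b) => a - b);
  const idx = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, idx)];
}

// 100 simulated vote-submission latencies, 50 ms to 347 ms.
const latencies = Array.from({ length: 100 }, (_, i) => 50 + i * 3);

console.log(percentile(latencies, 50)); // median → 197
console.log(percentile(latencies, 95)); // 95th percentile → 332
```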
3.7 Ethical Considerations and Limitations
Ethical considerations will be observed throughout the study by ensuring informed consent,
protecting participant privacy, maintaining confidentiality, and securely handling all collected
data, which will be used strictly for academic and system development purposes. Participants’
identities will be kept anonymous, and participation will be voluntary. However, the study may
have limitations, including a limited sample size, time and resource constraints, and differences
in participants' levels of technological familiarity, all of which may influence the findings.
3.8 Time Schedule
The project timeline will be planned over a six-month period, covering planning, development,
testing, data analysis, and final submission phases. This personally managed schedule will allow
flexibility for delays while maintaining momentum.
Figure 3.1: Project time schedule (Gantt chart) for the online voting system development.
3.9 Project Budget
This breakdown covers the estimated costs that will be involved in the development and
evaluation of the system.
Category     Item                                       Estimated Cost (KES)
Hosting      Cloud Infrastructure (AWS/Azure)           12,000
Security     SSL Certificates & Domain Registration      8,000
Field Work   Participant Incentives (Pilot Users)       15,000
Materials    Software Licensing & API Access            10,000
Production   Printing, Binding, and Distribution         5,000
Total                                                   50,000
This will ensure feasibility without external funding.
References
Books
Hevner, A. R., & Chatterjee, S. (2010). Design research in information systems: Theory and
practice. Springer. [Link]
Vaishnavi, V. K., & Kuechler, W. (2015). Design science research methods and patterns:
Innovating information and communication technology (2nd ed.). CRC Press.
[Link]
Wieringa, R. J. (2014). Design science methodology for information systems and software
engineering. Springer. [Link]
Journals
Alqadami, S. A., Memon, Z. A., & Anjum, A. (2023). Agile software development and reuse
approach with Scrum and software product line engineering. Electronics, 12(15), Article 3291.
[Link]
Hron, M., & Obwegeser, N. (2022). Why and how is Scrum being adapted in practice: A
systematic review. Journal of Systems and Software, 183, Article 111110.
[Link]
Kariryaa, A., Müller-Birn, C., & Beck, F. (2022). How Scrum adds value to achieving software
quality? Empirical Software Engineering, 27(7), Article 165. [Link]
Park, S., Specter, M., Narula, N., & Rivest, R. L. (2021). Going from bad to worse: From
Internet voting to blockchain voting. Journal of Cybersecurity, 7(1), Article tyaa025.
[Link]
URLs
Django. (n.d.). Django documentation. Retrieved from [Link]
OWASP. (n.d.). OWASP ZAP: Open source proxy. Retrieved from [Link]
React. (n.d.). React: A JavaScript library for building user interfaces. Retrieved from
[Link]
Sutherland, J., & Schwaber, K. (2020). The 2020 Scrum guide. Retrieved from
[Link]