Article
Augmented Reality-Based Real-Time Visualization for Structural
Modal Identification
Elliott Carter 1 , Micheal Sakr 2 and Ayan Sadhu 3, *
Abstract: In the era of aging civil infrastructure and growing concerns about rapid structural deterioration due to climate change, the demand for real-time structural health monitoring (SHM) techniques has become predominant worldwide. Traditional SHM methods face challenges, including delays in processing data acquired from large structures, time-intensive dense instrumentation, and visualization of real-time structural information. To address these issues, this paper develops a novel real-time visualization method using Augmented Reality (AR) to enhance vibration-based onsite structural inspections. The proposed approach presents a visualization system designed for real-time fieldwork, enabling detailed multi-sensor analyses within the immersive environment of AR. Leveraging the remote connectivity of the AR device, real-time communication is established with an external database and Python library through a web server, expanding the analytical capabilities of data acquisition and data processing, such as modal identification, and the resulting visualization of SHM information. The proposed system allows live visualization of time-domain, frequency-domain, and system identification information through AR. This paper provides an overview of the proposed technology and presents the results of a lab-scale experimental model. It is concluded that the proposed approach yields accurate processing of real-time data and visualization of system identification information, highlighting its potential to enhance efficiency and safety in SHM by integrating AR technology with real-world fieldwork.
Keywords: augmented reality; structural health monitoring; real-time monitoring; modal analysis; system identification; application development

Citation: Carter, E.; Sakr, M.; Sadhu, A. Augmented Reality-Based Real-Time Visualization for Structural Modal Identification. Sensors 2024, 24, 1609. [Link]/s24051609

Academic Editor: Shirsendu Sikdar

Received: 8 February 2024; Revised: 26 February 2024; Accepted: 26 February 2024; Published: 1 March 2024

Copyright: © 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://[Link]/licenses/by/4.0/).

1. Introduction

1.1. Motivation

As more of the world's critical civil infrastructure nears the end of its usable lifespan, the urgency of rapid and real-time structural health monitoring (SHM) techniques becomes increasingly apparent. The American Infrastructure Report Card 2021 (ASCE 2021) highlighted the importance of protecting aging civil infrastructure, revealing that one-third of America's infrastructure is vulnerable to rapid deterioration. Over the last several years, there has been significant development in SHM technologies to undertake robust, rapid, and remote structural inspection [1–8]. However, traditional SHM techniques face significant implementation challenges under adverse weather and busy operational conditions worldwide. These challenges include prolonged structural closures, dense instrumentation, the need to capture data from inaccessible areas, and the absence of synchronous data acquisition methods. Consequently, data collected by field inspectors or maintenance engineers during structural inspections often remains unprocessed for several days due to their presence in bulk and dense formats [9]. Furthermore, given the increasing prevalence of aging infrastructure worldwide and their spatial nature, the delayed processing of data
original position, allowing researchers to measure the distance between the original and
current positions of the object. Ref. [23] developed an inspection methodology using an AR
headset that quantitatively documented irregularly shaped sidewalks, measured the length
and geometry of cracks in concrete, and proposed a framework for developing AR-based
applications for structural inspection.
Ref. [24] deployed an application using Unity to identify the type of damage on concrete surfaces using an AR interface. The framework also enabled geometry measurements of the cracked surfaces, leveraging inspection in hard-to-reach locations. An ML model was hosted in the cloud and called by Unity for app deployment in the AR glasses. Ref. [25] explored the possibility of connecting infrared thermography with AR to enhance the visualization of SHM information. With Unity, a primary tool in developing HL applications, a user interface was developed using the main camera, allowing users to look around their virtual environment, and thermal views were shown as 3D objects in the virtual Unity environment. Ref. [26] developed a computer vision algorithm combined with AR to localize fatigue cracks in structures that might easily be overlooked due to their small size. This approach led to an AR-based automatic fatigue crack detection and localization method that can provide holograms overlaid during onsite visual inspections with near-real-time results.
mechanical equipment using a DT and IoT. Strain sensors connected to an Arduino Mega transfer data via Wi-Fi to an IoT server. An in-built toolbox carried out the analysis online until the AR glasses requested the visualization of the information when a QR code was scanned. Ref. [34] created a DT system to maintain a bridge using AR technology. The users integrated static data from inspections, historical data, and damage records with dynamic data from real-time sensor monitoring to create the DT. The Azure IoT Hub and Azure SignalR services were used to connect the dynamic data with BIM, while Unity 3D was used to link the DT with AR technology.
1.5. Development of AR in Operation and Maintenance via Visualization without Using Any
Information Models
The translation of information models, however, into an AR platform demands high
computational processing, a characteristic that is absent in portable AR devices such as
HMDs. Autonomously self-updating a model with densely cumulated information about
a large infrastructure’s components remains a major obstacle for real-time monitoring.
Therefore, utilizing a 3D virtual model is disregarded in this study to focus on the real-time monitoring process without jeopardizing computational power. Along this line of disregarding 3D modeling, ref. [35] proposed a method based on Virtual
Tours for representing SHM data. The users documented their structure using connected
spherical panoramas to facilitate 3D comprehension. The relevant data in the form of
images, text, and files were linked to the environment and accessed by the user through
an interactive, user-friendly interface. Images created the 3D representation instead of
3D models that require high computation and storage requirements. Ref. [36] integrated
image-based documentation and AR for visualizing SHM data in the built environment.
The methodology captured an image, enabled annotations on the target’s point cloud, and
projected the annotations back into the 2D image for off-site and onsite inspection and
viewing. Ref. [37] developed a connection between AR and a wireless strain sensor. The
study established an IoT sensor that fed data to a MySQL server. The data were linked to an
AR device where real-time data was visualized over the physical target. In another study,
ref. [38] developed a human-centered interface that provides workers with real-time access
to structural data during inspection and monitoring through an AR environment. The
interface displayed real-time experiment data of maximum displacements, displacement
time histories, and easy access to manipulation of the experiment data. In those studies,
however, no form of raw data processing or modal analysis was achieved within AR for
real-time diagnostics of the acquired data. Other studies, such as [39], utilized domain
adaptation to synchronize and convert the damage-sensitive natural frequencies into a
unified feature domain. A k-nearest neighbors model was then trained to understand the
health status of the original dataset and subsequently anticipate forthcoming instances
obtained from continuous monitoring. Nevertheless, its implementation with AR and
real-time visualization is not addressed. Therefore, this paper proposes a novel and comprehensive methodology for visualizing real-time SHM data, including time-domain, frequency-domain, and system identification information such as the natural frequencies and mode shapes of a dynamical system, using AR.
The paper is organized as follows: First, a brief background of the selected AR device, the Microsoft HoloLens 2 (HL2), is provided. Then, the proposed AR methodology is
presented, including the overall workflow of the system, the application development,
the web server communications, and the means of displaying experiment data in real-
time through the HL2. The results of the data visualization are presented next using a
lab-scale experimental model. Finally, the conclusion section summarizes the key findings,
limitations, and future work of the proposed research.
Figure 1. The basic hardware of Microsoft HoloLens 2.
3. The Proposed AR-Based SHM Methodology

This research proposes new advancements in using AR to diagnose structural inspection data in real-time. The proposed approach consists of three steps. The first step begins with developing an AR application in the Unity engine that provides users with an intuitive platform for reading and interpreting experiment data retrieved from a database. Secondly, a server is set up to host a database for storing experiment data and managing external web requests. During experiments, data transmission between the server and HL2 is facilitated through HTTP requests. Thirdly, the AR application running on the HL2 processes and visually represents the retrieved data, projecting it within the mixed reality environment in real-time. This allows for immediate and interactive structural health inspections within the physical context.
Figure 2 shows a flowchart of the proposed data retrieval system. Once deployed to an HL2 model, users can send HTTP requests through the application to execute Hypertext Preprocessor (PHP) scripts on the server via Wi-Fi. The PHP scripts used in this research are responsible for querying the MySQL database and echoing retrieved data back to the Unity application. This enables the effective visualization of real-time experiment data within the AR environment and encompasses additional functionalities such as frequency-domain visualization and modal property estimations from the output data. This paper provides an overview of the development and authentication process of this new application. A deeper dive into the separate modules of the framework (i.e., application development, server communication, and data visualization) is subsequently explored. Figure 3 highlights the relationship and flow between the individual modules.
Figure 3. Relationship and communications between the three modules of the proposed methodology.
x_{k+1} = A x_k + w_k
y_k = C x_k + z_k

where:
x_k is the state vector of dimension n at time k;
w_k is the process noise due to disturbances and modeling inaccuracies;
y_k is the output vector;
z_k is the measurement noise;
A is the state matrix;
C is the observation matrix.

λ_i = ln(μ_i)/Δt
f_i = |λ_i|/(2π)

• Calculate the damping ratios ζ_i from the real and absolute parts of the continuous-time eigenvalues:

ζ_i = -Re(λ_i)/|λ_i|

4. Mode Shapes:
The mode shapes φ_i are the columns of the matrix V = CΨ, where Ψ is the eigenvector matrix of A:

V = [φ_1, ..., φ_i, ..., φ_n]
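Under the definitions above, the conversion from an identified discrete-time model (A, C) to modal parameters can be sketched in Python. This is a minimal NumPy illustration; `modal_parameters` is a hypothetical helper, not part of CESSIPy.

```python
import numpy as np


def modal_parameters(A, C, dt):
    """Natural frequencies, damping ratios, and mode shapes from an
    identified discrete-time state-space model (A, C)."""
    mu, Psi = np.linalg.eig(A)            # discrete-time eigenvalues / eigenvectors
    lam = np.log(mu) / dt                 # lambda_i = ln(mu_i) / dt
    f = np.abs(lam) / (2 * np.pi)         # f_i = |lambda_i| / (2 pi)
    zeta = -lam.real / np.abs(lam)        # zeta_i = -Re(lambda_i) / |lambda_i|
    V = C @ Psi                           # mode shapes: columns of V = C Psi
    return f, zeta, V
```

Note the minus sign in the damping-ratio line: for a stable system the continuous-time eigenvalues have negative real parts, which yields positive ζ_i.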
The data collected from the accelerometers is automatically written to a text file with
a specified path name on the server. To facilitate the seamless transfer of data to the
MySQL database, a Python script named ‘[Link]’ is written. This script
periodically monitors the text file and attempts to upload new data to the database at
intervals of approximately three seconds. This functionality is achieved by implementing
a variable, ‘last_processed_line’, which keeps track of the last processed line in the text
file. During each loop, the script compares the length of the text file in rows to the value
of ‘last_processed_line’, preventing duplicate data from being uploaded. If the values are
equal, no new data is added to the database, and no actions are performed. However, if
the length of the text file exceeds ‘last_processed_line’, all new lines of data are stored in
the MySQL database, and the value of ‘last_processed_line’ is updated to match the length
of the text file. When the database table is cleared, the value of ‘last_processed_line’ is
reset to zero to ensure accurate data uploads for subsequent experiments. Communication
between the HL2 application and the server was established through PHP and SQL queries,
enabling access to data stored within the MySQL database on the server. The PHP scripts
used in this experiment, along with their functionalities, are outlined in Table 1.
Table 1. PHP scripts used in this experiment and their functionalities.

[Link]: Retrieves existing data from the MySQL database, if any, and echoes the result.
[Link]: Sends existing data from the MySQL database to CESSIPy and echoes the output mode shape columns.
[Link]: Deletes any existing data in the MySQL database so a new experiment can be performed.
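The polling behavior of the Python upload script described above can be sketched as follows. This is a simplified illustration: `sqlite3` stands in for the MySQL connection, and the table and column names are hypothetical.

```python
import sqlite3


def upload_new_lines(txt_path, conn, last_processed_line):
    """Append any text-file lines past last_processed_line to the
    database; return the updated line counter."""
    with open(txt_path) as f:
        lines = f.read().splitlines()
    if len(lines) > last_processed_line:      # new data arrived
        rows = [(ln,) for ln in lines[last_processed_line:]]
        # with MySQL (e.g., mysql-connector) the placeholder would be %s
        conn.executemany("INSERT INTO accel_data (raw) VALUES (?)", rows)
        conn.commit()
        last_processed_line = len(lines)      # avoid re-uploading duplicates
    return last_processed_line
```

In a deployment, this function would be called in a loop with a pause between iterations (e.g., `time.sleep(3)`), matching the roughly three-second upload interval, and the counter would be reset to zero whenever the database table is cleared.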
Figure 5. Result of using the hand menu to reveal the system identification information.
Table 3. Frequency-domain visualization interface button functionalities.

Clear Graphs: Clears the graph display.
Figure 6. Using the "View Data" button to show time-domain data within the Unity test scene.
3.4.2. Frequency-Domain Visualization

The frequency-domain visualization window is responsible for calculating and displaying the FFT of the acquired time-series data shown in Figure 6. This is achieved by using the open-source "[Link]" numeric library in Unity to perform forward Fourier transforms on all three sensor streams and plot the results. The frequency-domain visualization interface contains two buttons: "View FFT" and "Clear Graphs", each with functionalities detailed in Table 3. The HL2 is able to calculate and display frequency-domain information from locally stored sample data upon user prompt in the mid-section of Figure 4.

3.4.3. Visualization of System Identification Information

The stable mode shapes estimation window is responsible for communicating with the CESSIPy Python library, which employs SSI techniques to identify eigenfrequencies, damping ratios, and mode shapes from output-only data. This is achieved by estimating the matrices from the SSI COV method. Similar to the frequency-domain visualization window, this interface contains two buttons: "View SSI" and "Clear Graphs", as detailed in Table 4. The HL2 device finally displays mode shape estimations from sample acceleration data using CESSIPy, graphed in the right-most section of Figure 4.
View FFT: Performs an FFT on the acceleration data plotted in the time-domain visualization window and plots the results.
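The forward transform behind the "View FFT" button can be illustrated in Python, with NumPy standing in for the [Link] call made in Unity; `fft_magnitude` is an illustrative name, not part of the application.

```python
import numpy as np


def fft_magnitude(accel, fs):
    """Single-sided magnitude spectrum of an acceleration record,
    analogous to what the 'View FFT' button plots on the HL2."""
    n = len(accel)
    spectrum = np.fft.rfft(accel)            # forward FFT of the time series
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)   # frequency axis in Hz
    return freqs, np.abs(spectrum) / n       # normalized magnitude
```

For the 100 Hz sampling used in the experiment described later, a structural resonance near 10 Hz would appear as a peak at the 10 Hz bin of this spectrum.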
Table 4. Stable mode shapes interface button functionalities.

View SSI: Calls "[Link]" to send experiment data to CESSIPy, returns and plots the resulting estimations.
Clear Graphs: Clears the graph display.
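The covariance-driven SSI idea behind the "View SSI" button can be illustrated with a deliberately simplified, single-channel Python sketch. This shows the algorithm's skeleton only, not the CESSIPy implementation; the function and parameter names are hypothetical.

```python
import numpy as np


def ssi_cov(y, dt, order=2, num_blocks=5):
    """Minimal single-channel SSI-COV sketch: identify natural frequencies
    and damping ratios from output-only data via a Toeplitz matrix of
    output correlations, an SVD, and the shift-invariance of the
    observability matrix."""
    N = len(y)
    # Output correlations r_k = sum_t y[t+k] * y[t]
    r = [np.dot(y[k:], y[:N - k]) for k in range(2 * num_blocks)]
    # Block Toeplitz matrix T[a, b] = r[num_blocks + a - b]
    T = np.array([[r[num_blocks + a - b] for b in range(num_blocks)]
                  for a in range(num_blocks)])
    U, s, _ = np.linalg.svd(T)
    O = U[:, :order] * np.sqrt(s[:order])    # observability matrix estimate
    A = np.linalg.pinv(O[:-1]) @ O[1:]       # shift invariance -> state matrix
    lam = np.log(np.linalg.eigvals(A)) / dt  # continuous-time eigenvalues
    f = np.abs(lam) / (2 * np.pi)            # natural frequencies (Hz)
    zeta = -lam.real / np.abs(lam)           # damping ratios
    return f, zeta
```

In practice, as noted in the conclusions, the block size and model order of such SSI routines may need manual adjustment per experiment to obtain stable estimates.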
4. Results and Discussion

4.1. Experimental Setup
To test the performance of the application running on an HL2 device in real-time, a three-degree-of-freedom (DOF) model is mounted on a shake table (APS 113, manufactured by APS Dynamics, San Juan Capistrano, CA, USA). Each story measures 200 mm in height and 250 mm in width. The story slabs have a thickness of 10 mm and are supported by stainless steel columns of 40 mm × 3 mm. The APS 113 is a long-stroke shaker with linear ball bearings. The output excitation signal is regulated via its power amplifier. The data acquisition system is composed of the PXIe-1092 chassis (manufacturer: National Instruments, Austin, TX, USA), which manages the input and output signals. A Lenovo ThinkStation PC (3.70 GHz@4.50 GHz, 32 GB RAM, 1.0 TB SSD, Nvidia GeForce, Santa Clara, CA, USA) hosts the software that manages these signals. Three wired accelerometers (i.e., sensitivity (±10%): 1000 mV/g, range: ±5 g pk) are securely attached at each story level of the structure. A white noise random excitation ranging from 1 to 100 Hz with an acceleration root mean square of 0.5 m/s² is executed. Figure 7a sums up the setup of the mounted structure on the shaker, while Figure 7b represents the entire Data Acquisition (DAQ) chassis system.
4.2. Application of Proposed AR-Based Visualization
While the shake table is initialized, the “Start Capture” feature is activated from
within the AR application to check the data file for new acceleration data. Because the
accelerometers capture data in voltage, the data must first be converted to acceleration at a rate of 106.0 mV per m/s² before being processed by the Unity app. Via three sensors, the
time-history data is seamlessly streamed to the HL2 device and plotted in parallel. The data
is collected for 60 s at a frequency of 100 Hz, resulting in 6000 active plotted data points
per channel without any noticeable performance drops. The virtual graphs are constantly
updated throughout the inspection period. The system allows the observation and analysis
of modal information over a continuous data collection period while standing directly over
the targeted structure, as represented in Figure 8a. Figure 8b shows the user accessing the
developed application (TimeHistorySHM) on the HL2 system. The live visualization of the
updated charts characterizes the system as a real-time monitoring process while securing a
comprehensive modal analysis and system identification strategy. In addition, the system
has the capability of displaying the last updated versions of the time-domain graphs, as
seen in Figure 9, if any temporary manual or technical interruptions in the connection occur.
Subsequent interfaces are used to calculate and display the frequency domain, as shown
in Figure 10, in addition to the mode shape estimation using SSI, highlighted in Figure 11,
which is estimated using the proposed application.
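The voltage-to-acceleration scaling applied before plotting reduces to a single division, sketched here with the 106.0 mV per m/s² rate quoted above; the function name is illustrative.

```python
def volts_to_accel(reading_mv, sensitivity_mv_per_ms2=106.0):
    """Convert a raw accelerometer reading in millivolts to m/s^2,
    using the 106.0 mV per m/s^2 rate applied in the Unity app."""
    return reading_mv / sensitivity_mv_per_ms2
```

The Unity app applies this scaling to every sample of each channel before the time-history plots are updated.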
Figure 7. The experimental setup: (a) 3DOF model on the APS shaker; (b) DAQ chassis, amplifier, and PC setup.
Figure 8. (a) Tracking the targeted structure using HL2 and (b) accessing the AR application of time-domain information.
Figure 10. (a) Accessing the FFT feature and (b) the capture of real-time frequency-domain information.
The end visualization display strives to achieve a monitoring process under the
structure’s normal behavioral conditions. It can be observed that the proposed tool can
visualize all the essential diagnostics information associated with SHM data, as shown in
Figure 12, which may be useful for the maintenance engineer or inspector onsite.
Figure 11. (a) Accessing the mode shape feature and (b) the capture of real-time modal identification information.
5. Conclusions
In this paper, a visualization framework is developed for real-time structural inspection
data using AR. The proposed visualization interface consists of three main sections: time-
domain visualization, frequency-domain visualization, and mode-shape visualization.
The time-domain visualization window allows the user to visualize real-time structural
experiment data from a remote database via Wi-Fi. Subsequently, the frequency-domain
visualization window can be used to natively compute and visualize Fourier transforms on
the acquired time-series data. The mode shape estimations are obtained using CESSIPy,
a Python library for stochastic system identification in civil engineering. To facilitate
communication between the HL2 device and the server, PHP scripts are used. All PHP and
Python scripts are contained in a local XAMPP development environment, which includes
an Apache web server for processing HTTP requests and a MySQL database for storing
live experiment data. These resources are accessed on the HL2 device through a Unity app
via HTTP requests.
The accuracy of the module used in this research was assessed by performing a shake
table experiment and visualizing its time domain and frequency domain information in
MATLAB. The results were then compared to the data visualized within the HL2 device to
ensure the data was being processed correctly. Additionally, a wrapper function from the
CESSIPy library was called directly on the server to retrieve the estimated eigenfrequencies,
damping ratios, and modal shapes of the model. The data was then compared with the HL2
visualizations to ensure accurate data transmission. Based on the results of this research,
the following conclusions can be drawn:
(a) The program can accurately plot up to 18,000 points of acceleration data across
three subplots in real-time without any noticeable performance drops. Within the
AR environment, users can perform forward Fourier transforms with the collected
experiment data at any point during the experiment. The results of these analyses
are instantly stored and displayed on the HL2 device. This capability is pivotal for
instantaneous data analysis and visualization in structural health monitoring.
(b) The application seamlessly communicates with an external Python module via HTTP
requests, enabling it to perform SSI COV from recorded acceleration data. The Python
module is hosted on a web development server and works in conjunction with PHP
scripts to facilitate data transmission with the Unity application. In the future, a similar
approach can be employed to establish communications with additional libraries,
further expanding the capabilities of integrating AR with SHM.
(c) The system’s versatility extends beyond location constraints, as data transmission
occurs seamlessly over Wi-Fi whether in the lab or field. If the server and HL2 are
connected to the same network and the server scripts are updated to accommodate
any network changes, users can access and use the system virtually anywhere.
The existing application development used in this research has certain limitations that
should be considered for future enhancements. For example, as the number of plotted
points increases or the experiment’s sampling frequency rises, the performance of the
application on real hardware may be impacted negatively. In this research, the program was tested with 60 s of data across three sensors recording at 100 Hz, resulting in 6000 points per channel (18,000 in total) being plotted in real-time without any noticeable performance drops.
While there is no set limit for plotting data in the application, substantially exceeding
this threshold could result in a decreased frame rate and delayed script execution. As
communication between the HL2 and the server is achieved through a wireless internet
connection, the existing application is also constrained by the limitations of a wireless local
area network. This may include range- or bandwidth-related limitations that could affect
the performance of the program. When used in a new local area network, the internet
protocol (IP) address referenced by the Unity application must be updated appropriately
before the program can be used again. Finally, depending on the method of data collection,
the parameters used in the SSI functions called from the external Python library may need to
be manually adjusted for different experiments to ensure accurate mode shape estimations.
The advancements outlined in this research not only improve the accuracy of
structural assessments but also significantly expedite data analysis while enhancing the
overall safety of inspections. The proposed research holds the potential to revolutionize the
field of structural health monitoring by introducing more efficient, data-driven, real-
time visualization. Future research will focus on testing with additional degrees
of freedom, conducting real-world case studies, and expanding the scope of external modules
that can be integrated with AR. This study is a fundamental stepping stone in SHM toward
real-time visualization of time-series data via AR. It paves the way to intelligent onsite
visual inspections, such as visualizing onsite modal estimates, characterizing structural
damage, accounting for thermal and environmental correlations and influences, and tracking
long-term variations of structural parameters.
Author Contributions: Conceptualization, E.C. and M.S.; Validation, E.C.; Formal analysis, E.C.; In-
vestigation, M.S.; Writing—original draft, E.C. and M.S.; Writing—review & editing, A.S.; Supervision,
A.S. All authors have read and agreed to the published version of the manuscript.
Funding: The proposed research was funded by the Natural Sciences and Engineering Research
Council (NSERC) of Canada through the corresponding author's Discovery Grant, as well as
Western University's Undergraduate Student Research Internship. The authors also thank the Western
Academy for Advanced Research (WAFAR) at Western University for providing financial support
through the Western Fellowship to the corresponding author.
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: Data are contained within the article.
Conflicts of Interest: The authors declare no conflict of interest.