
Proceedings of the International Conference on Sustainable Computing and Smart Systems (ICSCSS 2023)

IEEE Xplore Part Number: CFP23DJ3-ART; ISBN: 979-8-3503-3360-2

Augmented Reality-based Indoor Navigation using Unity Engine

DOI: 10.1109/ICSCSS57650.2023.10169855

Rokesh Maran B, Giridharan L, Krishnaveni R
Department of Computer Science & Engineering
Hindustan Institute of Technology and Science
Chennai, India
[email protected], [email protected], [email protected]

Abstract—The evolution of technology with the introduction of mobile devices has provided users with day-to-day advancements in existing technologies. In this work, Augmented Reality (AR) is combined with an artificial-intelligence navigation agent to track the provided environment. The Unity 3D engine provides an easily accessible inbuilt AR framework, and the navigation mesh agent traces paths by segregating walkable areas from areas that cannot be accessed. Current machine-learning systems use inertial sensors and high-end cameras for computer-vision-based motion tracking. This system, by contrast, requires no external hardware: the environment model is constructed from provided measurements in the Unity Engine along with virtual trackers, and it can function on any AR-enabled device. The proposed model aims to provide real-time virtual visualization for a better user experience.

Keywords—Augmented Reality, Real-Time Navigation, Unity Engine, Indoor Navigation, Android Navigation.

I. INTRODUCTION

Navigation techniques have evolved over the ages. People can easily navigate from one place to another with a visible route, for example with Google Maps. Today's systems widely use satellite-based GPS navigation technology to track the path to a desired destination, but GPS-based location trackers carry an equal share of limitations.

Augmented Reality mixes real and virtual worlds and is described as a "combination of real and virtual world information, interact at real-time, and are registered in 3D space" [1]. Augmented Reality can be used to display navigation information in real time, and visualizing the pathway in real space makes it easily understandable. It can be visualized with the help of an AR device, in which virtual data combined with real-time space is displayed to the user. The Unity 3D engine can be used for mixed-reality and graphical-interface rendering and development, such as educational virtual-reality systems [2]. Unity 3D is used in many areas, especially game development, animation, virtual-reality interaction, and project optimization [12]. Computer-vision-based indoor classification and path detection [3] is hardware dependent, whereas AR-based navigation does not require any high-end hardware or inertial sensors, which makes it time- and cost-efficient. Augmented-reality implementation and visualization require specialized devices that can support graphical interfaces and the SDKs provided by AR Foundation. There are several methods for visualizing virtual- and augmented-reality-based applications, and the mode of projection and visualization must be considered before developing an AR application [15]. Different technologies use different navigation mediums and display different data; examples include pointing arrows in the intended direction or painting roads in bold colors. It is proposed to design the user interface from the user's point of view, i.e., routes should be mapped separately from existing paths, with the route to the destination highlighted or drawn separately. Additionally, turns and destinations blocked from view by obstructions should be made visible by placing the overlaid lanes on the obstacle course; the pedestrian priority plan is the same [18].

II. RELATED WORK

Jacobe et al. advise using an augmented reality (AR)-based smart-phone app to enhance the visitor experience at the Wari Willka Museum in Peru. The authors begin by outlining the drawbacks of traditional museum visits and the potential benefits of utilizing AR-based technology to provide visitors with a more engaging and interactive experience. They go on to describe the development of their mobile app, which gives visitors access to AR elements including virtual tours, 3D models, and multimedia presentations via their smart-phones [8].

The Oregon State University application provides student information and campus data. The application can be accessed from the web browser of any device that supports virtual reality, and it can visualize real-time data. The application can navigate from building to building, and it also displays user information by scanning a fingerprint [12].

The Arizona mobile app is a cutting-edge platform created to give university students virtual information visualization. This application was created to provide quick and easy access to a variety of materials that students frequently use. The software includes features like maps that make it easy for students to navigate the campus and find crucial resources like classrooms, libraries, and dining areas [11].
979-8-3503-3360-2/23/$31.00 ©2023 IEEE 1696


Authorized licensed use limited to: COCHIN UNIVERSITY OF SCIENCE AND TECHNOLOGY. Downloaded on October 19,2024 at 04:01:07 UTC from IEEE Xplore. Restrictions apply.
The MIT campus navigation tool acts as a navigation guide, providing a visualized navigation system with the help of guided AR tools. It gives the user a target-location waypoint and displays student information. Features of this application include student information visualization, photo and video projection, and campus navigation.

Additionally, AR technology may have limitations in certain areas of the campus, such as buildings with poor GPS reception or complex indoor environments where accurate positioning may be challenging. Technical issues, such as glitches, inaccuracies, or compatibility problems with different devices, could also arise, potentially causing frustration and hindering the navigation experience. It is crucial to strike a balance between leveraging the benefits of AR-based navigation and ensuring that users remain present and connected to the campus environment [13].

Bakhmut et al. created a web application based on augmented reality that allows users to take virtual tours of the campus. The writers overlaid 3D reconstructions of campus buildings and landmarks onto actual photographs taken with a smart-phone camera using image recognition and tracking technology. The application also included further information on each location's historical and cultural significance. However, relying solely on an AR web application may limit the depth and breadth of information that can be provided compared to in-person guided tours. Virtual tours may lack the personalized touch and the ability to ask questions in real time, which can be important for prospective students or visitors seeking specific details or insights. Furthermore, technical issues such as compatibility problems, connectivity disruptions, or app performance inconsistencies may arise, affecting the overall user experience [9].

A study by Laura Ruotsalainen, Aiden Morrison, and Maija Mäkelä states that navigation entails employing a system that can localize itself, without any hardware pre-installed in the structure, by keeping track of the user's movement with a variety of sensors; this can be achieved through collaborative indoor navigation. Their system uses inertial sensors and high-grade video cameras to capture and recognize the environment with computer vision. One demerit of improving computer-vision-based perception for collaborative indoor navigation is the reliance on environmental conditions and infrastructure. In complex or cluttered indoor spaces, computer-vision algorithms may struggle to accurately interpret visual information due to occlusions, overlapping objects, or ambiguous features, resulting in difficulties in object recognition, tracking, and scene understanding. Moreover, poor lighting conditions can further degrade the performance of computer-vision systems: insufficient or uneven lighting can lead to reduced contrast, shadows, and reflections, making it challenging for algorithms to detect and extract meaningful visual features [20].

PROPOSED ARCHITECTURE

Fig.1. Architecture Diagram of Navigation Model

In Fig. 1, the workflow of the application design is explained. The floor measurements and locations are measured and classified, and the data is then uploaded for visualization. The floor plan is converted into a 3D floor environment model, and the meshes are segregated into walls, floors, etc. in the Unity engine. Separate objects are placed in the model for target detection, and new 2D sprites are assigned for better visualization. The resulting 3D models are then configured with augmented reality and path-finding to provide navigation.

A. Real-time Target Detection

Targets are displayed so that they can be accessed and tracked in real time. The target environment is accessed by the AR camera and detected as a plane using the AR session. The engine detects and traces the plane by comparing the natural features extracted from the camera image against a known reference. Once the plane is detected, AR Foundation tracks the environment, and content can be added easily using marker-based image tracking technology.

B. Implement Virtual Elements

Virtual elements move the user's interaction from the screen into the real world. The virtual-buttons sample shows how to use and adjust the visual tracker and immerse end users in the AR application. Visual trackers provide a useful way to navigate to a targeted location based on images.
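The natural-feature comparison described in Section A above is performed internally by AR Foundation/ARCore; the paper gives no algorithmic detail. As a purely illustrative sketch (descriptor size, threshold, and all function names here are assumptions, not the framework's API), target detection can be thought of as matching binary feature descriptors from the camera frame against descriptors of a known reference image:

```python
# Toy sketch of natural-feature matching for target/plane detection.
# Real AR frameworks do this internally with far more robust descriptors;
# here each "feature" is a 64-bit binary descriptor and similarity is
# measured by Hamming distance.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two binary descriptors."""
    return bin(a ^ b).count("1")

def match_descriptors(frame, reference, max_dist=16):
    """Nearest-neighbour matching; returns (frame_idx, ref_idx) pairs."""
    matches = []
    for i, d in enumerate(frame):
        best_dist, best_j = min((hamming(d, r), j) for j, r in enumerate(reference))
        if best_dist <= max_dist:
            matches.append((i, best_j))
    return matches

def target_detected(frame, reference, min_matches=3):
    """Declare the tracked target found once enough features agree."""
    return len(match_descriptors(frame, reference)) >= min_matches

# Reference descriptors extracted offline from the known target image.
REFERENCE = [0x0F0F0F0F0F0F0F0F, 0xAAAAAAAAAAAAAAAA, 0xFFFF0000FFFF0000]
print(target_detected(REFERENCE, REFERENCE))        # same image: True
print(target_detected([0x0, 0x0, 0x0], REFERENCE))  # unrelated frame: False
```

Once enough matches are found, the framework can estimate the plane's pose and anchor virtual content to it, which is the step the AR session origin then tracks frame to frame.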

III. SYSTEM DESIGN

A. AR and Path finding Framework

The AR Foundation framework is a powerful tool that offers a cross-platform AR software development kit (SDK) for implementing augmented-reality experiences across various devices. It simplifies the process of creating AR applications by providing a unified API that can be used on both Android and iOS platforms. One of the key components of the AR Foundation framework is ARCore, Google's platform for building augmented-reality experiences on Android devices. ARCore enables developers to create applications that utilize features like motion tracking, environmental understanding, and light estimation, allowing for realistic and immersive AR experiences on Android smart-phones and tablets. Here the ARCore SDK is used for the implementation. Combining the Unity engine and AR Foundation is effective for building an AR application. The AR session component is used for enabling and disabling the AR system. The AR session origin is used to position the user in the given field and also to alter scale and orientation. The AR camera is directly appointed as the user.

Fig.2. Path finding bake using NAVMESH Agent

The environment model is created using Unity assets and tagged separately based on type, which is useful for AI navigation-mesh baking. A NavMesh agent is used as the path-finding framework, and the AR line is drawn in the provided model according to the NavMesh data. The line renderers used as tracking lines traverse the NavMesh data, differentiating accessible from inaccessible areas to find the shortest path to the desired target. The walls are assigned spatial-mapping materials, making them invisible to the camera for better visualization.

B. User interface

The outcome of this system is an AR virtual navigation tracking element that interacts with real-time space and provides virtual trackers from a preferred user location to the provided destination using AI-based navigation tracking. The AR tracker can be toggled ON/OFF using the UI displayed on the screen of the same interface. The AR part of the project is made possible by the Unity 3D platform and AR Foundation, where an app is created to recognize the real-time plane target using a mobile-phone camera and show the virtual buttons. These virtual buttons are interactable in the physical world, and the tracking lines can be adjusted based on users' preferences.

Fig.3. Graphical User Interface

C. Real-Time Implementation

The targets to be tracked are created and displayed separately as virtual tags of the real-time locations. The AR camera component represents the user, and the navigation from the start point toward the targets is visualized with the help of line renderers, which can be seen in real-time space. The target locations can be changed with the dropdown UI provided in the canvas, and the location of the user can also be seen with the mini-map texture render provided in the canvas.

Fig.4. Virtual Target Tag in real-time

The distance between the user and the target is also calculated and displayed on the screen. With the QR re-center,

the position of the user is updated. A separate function for line adjustment with preferred height measurements is provided in a separate UI panel, where users can set their preferred line height. The line can be toggled ON/OFF with another UI button. The application is built on Android, and the mobile camera is used as the AR camera; the virtual tracking elements are implemented in real-time space for navigation.

The application was tested inside the targeted block: respondents were asked to use the AR navigation application to navigate from a start point to a target, and their satisfaction was noted. A set of questions regarding the performance and efficiency of the application was asked and recorded as a survey, categorized on a higher-to-lower scale across different age groups. The most positive respondents are categorized as strongly agreed, and the negative respondents are categorized as disagreed. Respondents who had difficulties in the beginning but turned out to be positive are marked as agreed.
TABLE 1: SURVEY QUESTIONS

Q1. I could clearly understand the directions shown in the app.
Q2. Will you use this app in the future when needed?
Q3. The placement of target locations and trackers is efficient.
Q4. Interaction and manipulation are easy to perform.
Q5. The AR navigation system is more efficient than the current system.
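The tracking line that respondents adjusted is drawn along the path computed by the NavMesh agent (Section III-A). As a rough illustration of that shortest-path step (this is not the authors' Unity code; NavMesh operates on a baked 3D mesh, and a 2D grid is assumed here purely for clarity), a breadth-first search over walkable floor cells returns the waypoints a line renderer could then draw:

```python
from collections import deque

# Illustrative stand-in for NavMesh path finding: 0 = walkable floor,
# 1 = blocked (wall). Returns the shortest sequence of cells from start
# to goal, i.e. the waypoints a tracking line would follow.

def shortest_path(grid, start, goal):
    """Breadth-first search; returns a list of (row, col) cells or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk back through predecessors to recover the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable, e.g. walled off

floor = [
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]
print(shortest_path(floor, (0, 0), (2, 3)))
# [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 3)]
```

Differentiating walkable from blocked cells here mirrors the walkable/inaccessible segregation the paper describes for the baked navigation mesh.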

Fig.5. AR Navigation in real-time

IV. RESULT

In this proposed system, users navigate the provided environment using AR and AI. As a first step, the system must connect to an AR-supported device that has real-time visualization support. Both the UI and the navigation AR line renderer are displayed with the help of the device camera. The navigation pathway is projected onto the real-world environment, and the height of the tracking line can be adjusted through the mobile UI. The destination to be navigated to is also displayed with a virtual tag. Targets can be selected from the dropdown UI provided in the line options.

Fig.6. Survey Responses (bar chart: number of respondents for each survey question Q1–Q5, grouped by response category: Strongly Agree, Disagree, Strongly Disagree)
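The on-screen distance readout described in Section III-C is not given a formula in the paper; a minimal sketch, assuming a straight-line Euclidean distance between the AR camera position and the target in scene coordinates, with 1 Unity unit taken as 1 metre (both assumptions, not stated by the authors):

```python
import math

# Sketch of the distance readout between the user (AR camera) and the
# currently selected target, as displayed on the UI canvas.

def distance_to_target(user_pos, target_pos):
    """Euclidean distance between two (x, y, z) positions, in scene units."""
    return math.sqrt(sum((u - t) ** 2 for u, t in zip(user_pos, target_pos)))

def format_readout(user_pos, target_pos):
    """One-decimal display string, assuming 1 scene unit = 1 metre."""
    return f"{distance_to_target(user_pos, target_pos):.1f} m"

# User at the origin at camera height; target 3 m across and 4 m ahead.
print(format_readout((0.0, 1.5, 0.0), (3.0, 1.5, 4.0)))  # 5.0 m
```

Recomputing this each frame, after the QR re-center updates the user's position, would keep the readout consistent with the adjusted start point the conclusion mentions.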
The model is compared with existing methodologies; the accuracy of each model is calculated and the mean accuracy is noted.

TABLE 2: ACCURACY COMPARISON

SI NO.  MODEL                                          ACCURACY
1       Wi-Fi based Positioning                        80%
2       Simultaneous Localization and Mapping (SLAM)   90%
3       RFID based Positioning                         87%
4       AR Mapping (PROPOSED)                          93.3%

V. CONCLUSION

The Mobile Campus Navigation Application with Augmented Reality is indeed a unique and innovative application that offers a range of features and functionalities designed to enhance the user experience. By combining technical requirements with user-friendly design principles, the application aims to improve user productivity and provide a seamless navigation experience. One of the standout features of the application is its indoor navigation system utilizing augmented reality (AR) technology. By leveraging the power of AR, the application can overlay target information onto the real-world camera view, making it easier for users to navigate their surroundings. This feature

allows users to see directions, points of interest, and other relevant information in real time, enhancing their understanding of their environment. A crucial aspect of the application is the calculation and display of the distance between the user and their destination. This information is presented on the user-interface canvas, enabling users to gauge their proximity to their desired location accurately. It is important to note that the distance calculation may vary depending on the user's starting point, accommodating changes in the user's position within the indoor space.

To further enhance the application, ongoing research should focus on improving its features and exploring ways to locate the user without the need for external barcodes or markers. By developing advanced localization techniques, such as utilizing Wi-Fi signals or Bluetooth beacons, the application can offer a more seamless and accurate user experience. Another area for improvement is handling large indoor environments like malls, airports, and railway stations. Implementing features that can segregate speed and distance data can enable the application to prioritize nearby targets effectively. This could involve algorithms that analyze user movement patterns and preferences to provide relevant and timely information, ensuring a smoother navigation experience within expansive indoor spaces. In conclusion, the Mobile Campus Navigation Application with Augmented Reality has significant potential for enhancing user productivity and simplifying navigation within complex indoor environments. By continually refining its features and addressing challenges such as user localization and efficient target allotment, the application can provide a robust and user-friendly solution for a wide range of users.

VI. REFERENCES

[1] Xiao qing-Chen, Shao wei Wang, "The Virtual-Reality-Technology application in education," Software Tribune, 2011, 12, 76–78.
[2] Xin Hui Ng, Woan Ning Lim, "Designing a mobile augmented reality-based indoor navigation system," IEEE 4th Global Conference on Life Sciences and Technologies, 2020, pp. 80–84.
[3] Laura Ruotsalainen, Aiden Morrison, Maija Mäkelä, "Improving Computer Vision-Based Perception for Collaborative Indoor Navigation," IEEE Sensors Journal, 2022.
[4] Lehui Huang, Bin Gui, "Study on application of product based on Unity3D," International Symposium on Computers & Informatics (ISCI 2015).
[5] Burton E. P., Frazier W., Annetta L., Lamb R., Cheng R., & Chmiel M., "Modeling an augmented reality game using pre-service," Journal of Research in Technology and Teacher Education, 2011, 19(3), 303–329.
[6] M. Salo, S. Suette, M. Bardorf, & P. Fröhlich, "Display Pointing: A qualitative study on the updated screen pairing technologies for device," IEEE, vol. 7, 2019.
[7] Y. Gu, D. Li, Y. Kamiya, and S. Kamijo, "Integration of Positioning and Activity Contextual Information for Urban Life logging," Navigation, vol. 67, no. 1, pp. 163–179, 2020.
[8] J. Jacobe, L. Jacobo, K. Salinas, P. Castañeda, and N. Moggiano, "Mobile Application Based on Augmented Reality to Encourage Tourism at the Wari Willka Museum," 2021 International Conference on Information Systems and Advanced Technologies (ICISAT), 2021.
[9] Bakhmut, N. Kunanets, V. Pasichnyk, O. Artemenko, and I. Tsmots, "Using Augmented Reality WEB-Application for Providing Virtual Excursion Tours in University Campus," 2021 IEEE 16th International Conference on Computer Sciences and Information Technologies (CSIT), 2021.
[10] Schmetz, M. Bellgardt, V. Wehrwein, T. Kuhlen, and C. Brecher, "Touch-based Augmented Reality Marking Techniques on Production Parts," Procedia CIRP, vol. 81, 2019.
[11] Arizona Mobile, University of Arizona. https://it.arizona.edu/service/arizona-mobile
[12] Oregon State University Mobile Application, Oregon State University. https://ecampus.oregonstate.edu/mobile/
[13] MIT Campus Navigation Tool, Stefan Gobel. https://books.google.co.in/books?id=bE7DVaIcc4kC
[14] T.; Diaz, P.; Santos, B. S., "Augmented Reality-Based Visualizations in Decision Making," Multimedia Tools and Applications, 2022, 81, 14749–14772.
[15] "ARCore - Google Developers" [Online]. Available: https://developers.google.com/ar. [Accessed 28 January 2023].
[16] Hartmann, N. Link, and G. F. Trommer, "Indoor 3D position estimation using low-cost inertial sensors and marker-based video tracking," Record - IEEE PLANS, Position Location and Navigation Symposium, pp. 319–326, 2010.
[17] K. Liu, G. Motta, T. Ma, and T. Guo, "Multi-floor Indoor Navigation with Geomagnetic Field Positioning and Ant Colony Optimization Algorithm," 2018.
[18] M. Baldauf, M. Salo, S. Suette, and P. Fröhlich, "Display pointing: A qualitative study on a recent screen pairing technique for smart phones," IEEE, vol. 7, 2019.
[19] Xin Hui Ng, Woan Ning Lim, "Design of a Mobile Augmented Reality-based Indoor Navigation System," 2020 IEEE 4th Global Conference on Life Sciences and Technologies.
[20] Laura Ruotsalainen, Aiden Morrison, Maija Mäkelä, "Improving Computer Vision-Based Perception for Collaborative Indoor Navigation," IEEE Sensors Journal, 2022.
