
AUGMENTED HORIZONS OF EXPLORING NEW REALITIES IN EDUCATION
MINI PROJECT REPORT

Submitted by

ABHIJITHA K 21DI01
KIRAN TS 21DI14
LAKSHMANDEV VK 21DI15
RAJARAJAN C 21IH12

under the guidance of


Mr. A KATHIRESAN
In partial fulfillment of the requirement for the award of

DIPLOMA IN INFORMATION TECHNOLOGY


STATE BOARD OF TECHNICAL EDUCATION
GOVERNMENT OF TAMILNADU

OCTOBER 2023

DEPARTMENT OF INFORMATION TECHNOLOGY


PSG POLYTECHNIC COLLEGE
(Autonomous and an ISO 9001: 2015 certified Institution)
COIMBATORE – 641 004
PSG POLYTECHNIC COLLEGE
(Autonomous and an ISO 9001: 2015 certified Institution)

DEPARTMENT OF INFORMATION TECHNOLOGY

COIMBATORE – 641 004

CERTIFICATE

This is to certify that the Mini Project report entitled

AUGMENTED HORIZONS OF EXPLORING NEW REALITIES IN EDUCATION


has been submitted by

ABHIJITHA K 21DI01
KIRAN TS 21DI14
LAKSHMANDEV VK 21DI15
RAJARAJAN C 21IH12

In partial fulfillment for the award of

DIPLOMA IN INFORMATION TECHNOLOGY


of the State Board of Technical Education,
Government of Tamil Nadu during the academic year 2023

Mr. A Kathiresan                              Mr. A Kathiresan
Faculty Guide                                 HoD In-charge

Certified that the candidates were examined by us in the Mini Project viva-voce examination held on
……………

INTERNAL EXAMINER EXTERNAL EXAMINER

ACKNOWLEDGEMENT

First and foremost, we thank the Almighty God for giving us the strength, knowledge,
ability, and opportunity to undertake this project and to persevere and complete it
satisfactorily.

We are ineffably indebted to our principal for giving us this opportunity and encouraging us to
accomplish this project.

We are highly indebted to Mr. A Kathiresan for his valuable guidance and constant supervision.
Without his able guidance, this project would not have been possible, and we shall be eternally
grateful to him for his assistance.

We acknowledge with a deep sense of reverence, our special gratitude towards our Head of the
Department Mr. A. Kathiresan, Department of Information Technology for his guidance,
inspiration, and suggestions in our quest for knowledge.

We would like to express our special gratitude and thanks to the special machines laboratory
and technicians for giving us such attention and time.

We would like to express our gratitude towards our parents for their tremendous contribution in
helping us reach this stage in our lives. This would not have been possible without the unwavering
and unselfish love, cooperation, and encouragement they have given us at all times.

We have put considerable effort into this project. However, it would not have been possible
without the kind support and help of many individuals. We would like to extend our sincere
thanks to all of them.

Any omission in this brief acknowledgment does not mean a lack of gratitude.

ABSTRACT

Augmented Reality (AR) has emerged as a transformative force in education, revolutionizing
traditional learning paradigms. This report introduces an innovative educational application
developed with Unity, harnessing AR's potential to create an immersive and interactive learning
environment.

The project's primary goal is to augment the learning process by seamlessly integrating digital
information into the physical world. Leveraging Unity's powerful development environment,
the application aims to provide learners with a dynamic three-dimensional educational
experience.

Methodologically, Unity is chosen for its versatility and widespread adoption, facilitating the
seamless integration of AR elements across platforms. The development process focuses on
creating 3D models, animations, and interactive elements, meticulously aligned with specific
learning objectives.

Marker-based AR, implemented through AR Foundation or Vuforia, enables the recognition
and tracking of physical markers, triggering the overlay of digital content. Additionally, the
application emphasizes user interaction, incorporating gestures, touch input, and voice
commands to engage learners in a multi-modal learning experience.

Educational scenarios encompass a wide range of possibilities, such as historical
reconstructions, allowing students to virtually explore landmarks, scrutinize architectural
details, and study ancient artifacts with unprecedented depth.

In summary, this Unity-based AR application showcases the transformative potential of AR
technology in education. By seamlessly merging digital and physical worlds, it introduces a
new era of immersive and interactive learning experiences, reshaping the landscape of
knowledge acquisition.

TABLE OF CONTENTS

TOPICS                                                                      PAGE NO
Certificate Page ………....………………………………………………… II
Acknowledgement ………………………………………………………… III
Abstract ...………………………………………………………………....... IV
Table of Contents.…………………………………………………………. V
List of Figures ……...………………………………………………………. VIII

1. INTRODUCTION ….….….….….….….….….….….….….…. 01

1.1 Introduction To Project 01

1.2 Objective 02

1.3 Problem Statement 04

1.4 Summary 04

2. LITERATURE REVIEW ….….….….….….….….….….….………... 04

2.1 Literature Review 04

2.2 Summary 06

3. SYSTEM REQUIREMENTS …….…………………………………... 06

3.1 Hardware Requirements 06

3.2 Software Requirements 08

3.2.1 Software Description 09

3.2.1.1 Unity 09

3.2.1.2 Blender 11

3.3 Functional and Non-Functional Requirements 13

3.4 Summary 15
4. SYSTEM DESIGN ...……...……………………………............................................. 16

4.1 Existing System 13

4.1.1 Advantages of Existing System 15

4.1.2 Drawback of Existing System 16

4.2 Proposed System 17

4.3 Data Flow Diagram 22

4.4 Summary 23

5. IMPLEMENTATION ……………………………………………………………. 24

5.1 Working of 3D Model 24

5.2 Working In Unity 25

5.2.1 Unity In Phone 27

5.3 Summary 28

6. TESTING …………………………………………………………………………………. 29

6.1 Testing Levels 29

6.1.1 Unit Testing 29

6.1.2 Integration Testing 29

6.1.3 System Testing 29

6.1.4 Acceptance Testing 30

6.1.5 Performance Testing 30

6.2 Summary 30

7. CODING AND OUTPUT ………………………………………………… 32

7.1 Multi Target Image Processing 32

7.1.1 Coding 32

7.1.2 Description 33

7.2 Output 34

7.2.1 Solar System 34

7.2.2 Sonometer 38

PROJECT DEVELOPMENT……………………………………………. 40

8. CONCLUSION……………………………………………………………. 41

FUTURE SCOPE ………………………………………............................... 42

BIBLIOGRAPHY ……………………………………….............................. 43

LIST OF FIGURES

FIGURE NO NAME OF THE FIGURE PAGE NO


3.1          Unity                              08

3.2          Unity AR Foundation                10

3.3          Blender                            11

4.1          Block Diagram                      21

4.2          DFD Level 0                        22

7.1          Earth                              34

7.2          Uranus                             35

7.3          Mars                               35

7.4          Mercury                            36

7.5          Neptune                            36

7.6          Sun                                37

7.7          Saturn                             37

7.8          Venus                              38

7.9          Sonometer                          38

7.10         Sonometer                          39

7.11         Sonometer                          39
Introduction Chapter 1
CHAPTER 1
INTRODUCTION

1.1 INTRODUCTION TO PROJECT

Education has always been a dynamic field, constantly evolving to incorporate new
technologies and methodologies that enhance the learning process. In recent years, one such
groundbreaking technology, Augmented Reality (AR), has emerged as a powerful tool with the
potential to revolutionize education. Augmented Reality, a blend of digital and physical worlds,
overlays digital content onto the real world through devices like smartphones, tablets, or AR
glasses. This technology introduces a new dimension to learning, offering immersive,
interactive, and engaging experiences for students of all ages and across various disciplines.

AR's potential in education is boundless, offering a diverse range of applications that can be
tailored to cater to different learning styles and objectives. By seamlessly integrating virtual
elements into the physical environment, AR bridges the gap between theoretical knowledge
and real-world applications, providing students with a unique opportunity to explore, interact,
and understand complex concepts in a more tangible and memorable way.

One of the key advantages of AR in education is its ability to make abstract or complex subjects
more accessible. For instance, in biology classes, students can use AR to dissect virtual
organisms, gaining a deeper understanding of anatomical structures and biological processes
without the need for physical specimens. Similarly, in physics, AR simulations can bring
complex physical phenomena to life, allowing students to experiment with concepts like
gravity, motion, and electricity in a controlled, interactive environment.

Furthermore, AR fosters collaborative learning experiences. By enabling multiple users to
interact with the same augmented content simultaneously, students can engage in group
activities, discussions, and problem-solving exercises, promoting teamwork and enhancing
communication skills. This collaborative aspect of AR not only mirrors real-world professional
environments but also prepares students for future careers that increasingly rely on teamwork
and interconnectivity.


AR also addresses the diverse learning needs of students by providing customizable
experiences. Educators can adapt AR content to cater to different learning styles, ensuring
that visual, auditory, and kinesthetic learners all have opportunities to excel. For example, a
history lesson on ancient civilizations could incorporate AR reconstructions of ancient cities,
allowing visual learners to explore, auditory learners to listen to historical narratives, and
kinesthetic learners to physically interact with virtual artifacts.

Beyond traditional classroom settings, AR extends the boundaries of education by enabling
immersive field trips and experiential learning. Students can virtually visit historical
landmarks, explore ecosystems, or even travel to distant planets, all from the comfort of their
classrooms. This not only broadens their horizons but also makes learning more inclusive and
accessible to those who may face physical or financial barriers to traditional field trips.

1.2 OBJECTIVE

The primary objective of integrating Augmented Reality (AR) into education is to enhance
and enrich the learning experience for students across various subjects and disciplines. This is
achieved through several key goals:

Enhanced Engagement: AR captivates students' attention by providing interactive and
immersive content, making learning more engaging and enjoyable. This heightened
engagement encourages active participation, which is crucial for effective learning.

Improved Understanding of Complex Concepts: AR facilitates the visualization of abstract
or complex ideas, allowing students to interact with and manipulate virtual objects or
environments. This hands-on experience aids in comprehending challenging concepts by
providing a tangible, real-world context.

Fostering Critical Thinking and Problem-Solving Skills: AR encourages students to think
critically, analyze information, and solve problems in dynamic, interactive environments. It
promotes a deeper level of understanding and enables students to apply their knowledge in
practical scenarios.


Customized Learning Experiences: AR technology can be tailored to accommodate diverse
learning styles and preferences. It offers flexibility in content delivery, allowing educators to
adapt materials to suit individual student needs, ensuring a more inclusive and effective
learning environment.

Facilitating Collaborative Learning: AR enables group activities and collaborative projects,
fostering teamwork and communication skills. Students can work together to solve problems,
share ideas, and learn from one another, mirroring real-world collaborative scenarios.

Accessible Experiential Learning: AR brings virtual field trips, simulations, and experiments
into the classroom, making experiential learning more accessible and inclusive. Students can
explore places and scenarios that would otherwise be impractical or impossible to visit
physically.

Preparation for Future Careers: By exposing students to cutting-edge technology, AR equips
them with skills and experiences that are increasingly relevant in today's technology-driven
job market. It helps bridge the gap between classroom learning and real-world applications.

Inspiring Curiosity and Lifelong Learning: AR stimulates curiosity and a sense of wonder
in students. It encourages exploration, inquiry, and a desire to seek out knowledge
independently, fostering a lifelong love of learning.

Adaptability and Future-Readiness: AR is a rapidly evolving technology, and its integration
into education prepares students to adapt to new technological advancements. It instills a sense
of adaptability and a willingness to embrace innovative tools and methodologies.

Measurable Learning Outcomes: AR in education allows for the collection of data on student
interactions and performance. This data can be used to assess learning outcomes, identify areas
for improvement, and tailor instructional approaches for better results.


1.3 PROBLEM STATEMENT:

The integration of Augmented Reality (AR) in education faces multifaceted challenges. Educational
institutions often struggle with the limited integration and accessibility of AR due to inadequate infrastructure
and technical expertise. The absence of standardized frameworks for AR content creation contributes to
variations in quality and consistency across educational materials. Cost and resource constraints pose
significant barriers, creating a digital divide where some students have access to AR-enhanced learning while
others do not. Teacher training and acceptance are crucial, with a lack of comprehensive programs hindering
educators from effectively incorporating AR into their teaching methodologies. Ensuring the relevance and
alignment of AR content with educational objectives remains a persistent challenge, and privacy concerns
related to data security and ethical considerations further complicate implementation. Additionally, the
potential for AR to widen socioeconomic disparities and the need for standardized metrics to evaluate its
effectiveness underscore the complex landscape surrounding AR integration in education. Addressing these
challenges requires collaborative efforts to create a supportive ecosystem for the successful and equitable
implementation of AR in educational settings.

1.4 SUMMARY:

Augmented Reality (AR) in education represents a transformative integration of digital and physical
learning environments, enriching educational experiences by overlaying virtual information onto the real
world. By leveraging AR technologies, educators can create immersive and interactive content, allowing
students to engage with educational materials in innovative ways. AR in education enhances visualization
of complex concepts, enabling students to explore subjects like science, history, and mathematics through
interactive 3D models and simulations. It fosters a dynamic and student-centered learning environment,
catering to diverse learning styles. Moreover, AR provides opportunities for collaborative learning and real-
world application of knowledge. As AR technologies continue to evolve, their potential to revolutionize
education by fostering engagement, critical thinking, and curiosity is increasingly evident, opening new
horizons for interactive and personalized learning experiences.

Literature Review Chapter 2

CHAPTER 2
LITERATURE REVIEW

2.1 LITERATURE REVIEW:

Title: "Augmented Reality in Education and Training: Pedagogical Approaches and Illustrative
Case Studies"

Authors: Parsons, D., & Caris, M.

Publication Year: 2014

Journal: Journal of Educational Technology Systems

Abstract: Parsons and Caris provide a comprehensive exploration of pedagogical approaches
and present illustrative case studies that demonstrate the integration of AR in education and
training. They emphasize how AR enhances learning experiences by providing interactive and
engaging content, leading to deeper understanding and retention of educational material.

Title: "Augmented Reality: An Overview and Five Directions for AR in Education"

Authors: Dunleavy, M., & Dede, C.

Publication Year: 2014

Journal: Journal of Educational Technology Research and Development

Abstract: This review offers a comprehensive overview of AR technology and suggests five
pivotal directions for its application in education. The authors highlight how AR has the
potential to revolutionize educational practices by providing immersive, interactive, and
contextual learning experiences, ultimately leading to improved learning outcomes.

Title: "A Review on the Use of Augmented Reality in Education: From the Perspective of
Motivation, Cognitive Load, Presence, and Practical Implementation"

Authors: Bacca, J., Baldiris, S., Fabregat, R., Graf, S., & Kinshuk

Publication Year: 2014

Journal: Educational Technology & Society

Abstract: Bacca et al. consider crucial factors such as motivation, cognitive load, and presence
when evaluating the effectiveness of AR in education. The review discusses practical
implementation strategies and offers a nuanced understanding of how AR influences the
learning environment. It highlights the potential of AR to create immersive and interactive
learning experiences.

Title: "Augmented Reality in Education: A Review"

Authors: Azuma, R. T.

Publication Year: 2013

Journal: ISMAR

Abstract: Azuma's comprehensive review provides an in-depth examination of AR
technology's applications and benefits in education. The review emphasizes how AR can be
harnessed to create dynamic and engaging learning experiences, allowing learners to interact
with virtual objects and environments. Azuma argues that AR has the potential to significantly
enhance educational practices and improve learning outcomes.

2.2 SUMMARY:

The literature survey on Augmented Reality (AR) in education reveals a growing body of
research and practical applications that highlight the transformative impact of AR on the
learning experience. Studies consistently emphasize the potential of AR to enhance student
engagement, understanding, and retention of educational content. AR is found to be particularly
effective in science, technology, engineering, and mathematics (STEM) subjects, providing
interactive simulations and 3D visualizations that facilitate deeper comprehension.
Additionally, literature suggests that AR fosters a student-centric learning environment,
catering to diverse learning styles and encouraging active participation. Challenges such as
hardware limitations, integration into curricula, and the need for teacher training are also
acknowledged. Overall, the literature survey underscores the significant role of AR in
reshaping education by offering innovative and immersive learning opportunities that go
beyond traditional teaching methods. As the field of AR in education continues to advance,
research and practical implementations contribute to a growing understanding of its benefits,
challenges, and the potential for revolutionizing the educational landscape.

System Requirements Chapter 3

CHAPTER 3
SYSTEM REQUIREMENTS

3.1 HARDWARE REQUIREMENTS:

Development Machine:

• Processor: Intel Core i7 or equivalent
• RAM: 16 GB or higher
• Graphics Card: NVIDIA GTX 1060, AMD Radeon RX 480, or higher
• Storage: SSD for faster data access

AR Device:

Depending on the target platform (smartphones, tablets, or AR glasses), choose
hardware with AR capabilities: for example, ARKit for iOS devices or ARCore for
Android devices.

3.2 SOFTWARE REQUIREMENTS:

Development Environment:

Integrated Development Environment (IDE) such as:

• For Android: Android Studio with ARCore support
• For iOS: Xcode with ARKit support
• For cross-platform: Unity 3D with AR Foundation, or Unreal Engine with
ARCore/ARKit support

Operating System:

• Windows (for Android development)
• macOS (for iOS development)

AR SDKs/Frameworks:

• ARKit (for iOS)
• ARCore (for Android)
• AR Foundation (Unity)
• Unreal Engine with ARKit/ARCore support

Programming Languages:


• For iOS: Swift or Objective-C
• For Android: Kotlin or Java
• For cross-platform: C# (Unity) or C++ (Unreal Engine)

Version Control:

• Git for source code management and collaboration

Additional Tools and Libraries:

3D Modeling and Animation Software

• Blender, Maya, 3ds Max, or other 3D modeling tools for creating assets.

Graphics Editing Software:

• Adobe Photoshop, GIMP, or similar for creating textures and UI elements.

AR Content Creation:

• Tools for creating AR content, including 3D models, animations, and AR
markers.

Security and Privacy Considerations:

Data Security:

• Implement secure data handling practices, including encryption and secure
authentication.

Privacy Compliance:

• Ensure compliance with data protection regulations (e.g., GDPR, HIPAA).

Documentation and Support:

User Documentation:

• Provide user manuals or guides for navigating and using the AR application.

Developer Documentation:

• Document codebase, APIs, and any custom features for future maintenance
and development.


3.2.1 SOFTWARE DESCRIPTION


3.2.1.1 UNITY:

Unity Engine is a versatile and user-friendly platform renowned for game and
interactive application development. Its cross-platform capabilities streamline development
across various devices and operating systems. With support for multiple languages, including
C#, Unity offers a flexible coding environment. The Unity Asset Store provides a wealth of
pre-made assets and tools, expediting the development process. Whether crafting 2D games or
complex 3D simulations, Unity's toolset is adaptable to various dimensions. Its robust physics
engine and animation systems allow for lifelike movements and interactions. With advanced
graphics features and dedicated frameworks for VR and AR, Unity is a go-to choice for
immersive experiences and modern interactive applications.

Fig No:3.1 Unity

UNITY IN AR:

Unity is a powerful engine for developing AR (Augmented Reality) applications due to its
specialized framework called Unity AR Foundation. Here's how Unity works in AR
applications:


AR Foundation Integration:
Unity's AR Foundation is a package that streamlines AR development by providing a unified
API for working with various AR platforms, including ARKit (iOS) and ARCore (Android).
This allows developers to write code once and deploy it across multiple AR-compatible devices.

AR-Capable Devices:
Unity AR applications are deployed on devices equipped with AR-capable hardware. These can
be smartphones, tablets, or AR glasses that support ARKit or ARCore.

Scene Creation:
Developers use Unity's intuitive interface to create the AR scene. They can import 3D models,
animations, textures, and other assets to build the virtual environment where the AR experience
will take place.

AR Camera Setup:
Unity provides a specialized AR Camera component that replaces the standard camera used in
traditional games. This camera is designed to capture the real-world environment and integrate
virtual elements seamlessly.

AR Tracking and Detection:
AR Foundation handles the tracking of real-world objects and surfaces. It uses the device's
camera to identify and understand the geometry of the physical environment, allowing virtual
objects to interact with it.

Placement of Virtual Objects:
Using Unity's scripting capabilities (usually in C#), developers can program the behavior of
virtual objects. These objects can be anchored to real-world surfaces or positioned in specific
locations within the AR environment.

User Interaction:
Unity enables developers to implement interactions between users and virtual objects. This can
include gestures, touch controls, or even voice commands, allowing users to manipulate and
engage with the AR content.
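The tracking, placement, and interaction steps above can be combined in a single component. The following C# script is only an illustrative sketch (the class and field names are our own, not the project's actual code); it assumes a Unity scene with the AR Foundation package installed, an ARRaycastManager in the scene, and a prefab assigned in the Inspector, so it runs only inside a Unity project:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical example: place a prefab on a detected real-world plane
// wherever the user taps the screen.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] GameObject placedPrefab;      // assigned in the Inspector
    [SerializeField] ARRaycastManager raycastManager;

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the touch position against detected planes.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            // Anchor the virtual object to the real-world surface.
            Pose hitPose = hits[0].pose;
            Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```

This single script touches three of the steps above: AR Foundation's plane tracking supplies the surfaces, the raycast places the virtual object, and the touch input provides the user interaction.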


Graphics and Rendering:
Unity's powerful rendering engine ensures that virtual objects are seamlessly integrated into the
real-world environment. This includes considerations for lighting, shadows, and reflections to
make the AR experience as realistic as possible.

AR User Interface (UI):
Unity provides tools for creating user interfaces within the AR environment. This allows for the
display of information, menus, and interactive elements that enhance the user experience.

Cross-Platform Compatibility:
With AR Foundation, developers can write code that is compatible with both ARKit and ARCore,
making it easier to deploy AR applications on both iOS and Android devices.

Testing and Debugging:
Unity's integrated development environment includes robust tools for testing and debugging
AR applications. Developers can simulate AR experiences on their computers and test them
on real devices.

Deployment:
Once the AR application is developed and thoroughly tested, it can be built for various target
platforms, such as iOS and Android, and deployed through their respective app stores.

Fig No:3.2 Unity AR Foundation

3.2.1.2 BLENDER:
Blender is a versatile and open-source 3D modeling and animation software renowned for
its robust capabilities. It empowers users to create intricate 3D models, animate characters, and


simulate dynamic environments. Its intuitive interface and extensive documentation make it
accessible to professionals and novices alike. With real-time rendering and an integrated game
engine, Blender caters to game developers and filmmakers, offering a comprehensive suite of
tools. Moreover, its scripting capabilities enable customization, enhancing its adaptability for
various projects. From architectural visualization to visual effects, Blender stands as a
cost-effective and powerful solution for 3D content creation.

Fig No:3.3 Blender

WORKING IN BLENDER:
Blender 3D is a powerful open-source 3D creation suite that is used for various purposes,
including 3D modeling, animation, rendering, and even game development. If you're working
in Blender 3D, you can do a wide range of tasks, from creating 3D models to producing
animated films or games. Here are some common tasks and features you might encounter while
working in Blender 3D:

3D Modeling: You can use Blender to create 3D models of objects, characters, and
environments. This includes extruding, sculpting, and texturing objects to make them look
realistic.

Animation: Blender supports keyframe animation, rigging, and character animation. You can
animate objects, characters, and even create complex character rigs.

Rendering: Blender has a built-in rendering engine, Cycles, and Eevee for real-time rendering.
You can create stunning visuals and render animations or still images.


Texturing and Materials: You can apply textures and materials to your 3D models, making
them look like different materials (e.g., wood, metal, or glass).

Lighting: Set up and control various types of lighting in your scenes to achieve the desired
atmosphere and visual effects.

Physics Simulations: Blender offers physics simulations for smoke, fluid, cloth, and more.
You can create realistic physical interactions in your scenes.

Compositing: Blender includes a node-based compositor, allowing you to post-process your
renders, add effects, and manipulate the final output.

Video Editing: You can use Blender for video editing and post-production work, including
cutting, splicing, and adding effects to videos.

Game Development: Blender integrates well with external game engines such as Unity and
Unreal. You can build, animate, and export game assets directly from Blender.

3D Printing: Blender has tools to prepare 3D models for 3D printing. You can check and fix
models for printability.

Scripting: Blender has a Python scripting API that enables you to automate tasks, create
custom tools, and extend its functionality.

Add-Ons: You can extend Blender's capabilities by installing various add-ons created by the
community or developing your own.
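As a small illustration of the scripting API mentioned above, the Python snippet below is a hypothetical example (object and material names are our own). It must be run from Blender's Scripting workspace, since the bpy module is only available inside Blender:

```python
import bpy  # Blender's Python API; available only inside Blender

# Hypothetical example: create a cube and assign it a colored material.
bpy.ops.mesh.primitive_cube_add(size=1.0, location=(0.0, 0.0, 0.5))
cube = bpy.context.active_object
cube.name = "AR_Marker_Cube"

mat = bpy.data.materials.new(name="MarkerMaterial")
mat.diffuse_color = (0.1, 0.4, 0.8, 1.0)  # RGBA
cube.data.materials.append(mat)
```

Scripts like this can batch-generate or clean up assets before they are exported (e.g., as FBX or glTF) for use in Unity.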

3.3 FUNCTIONAL AND NON-FUNCTIONAL REQUIREMENTS

Functional Requirements:
Image Recognition: The AR system should be able to recognize and track images in real-time.
Implement image recognition algorithms to identify predefined images or patterns.

AR Interaction: Integrate AR Foundation to enable the placement of virtual objects in the real
world based on image recognition. Implement interactive features, such as tapping or dragging
virtual objects in response to recognized images.

Markerless Tracking: Include markerless tracking capabilities to enable AR
experiences without the need for predefined markers.

Object Manipulation: Allow users to manipulate virtual objects through gestures or
touch interactions.

Scene Understanding: Implement scene understanding to recognize the environment and
adapt AR content accordingly.

Cross-Platform Compatibility: Ensure the AR application works seamlessly on various
platforms, such as iOS and Android, using AR Foundation.

User Interface (UI): Design and implement a user-friendly interface for configuring and
interacting with AR features.

Performance Optimization: Optimize the application for smooth performance,
considering factors like frame rate, rendering, and responsiveness.
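To show how the image recognition and AR interaction requirements above might be met in Unity, the following C# sketch (illustrative only; class and field names are our own, and the event API is as in AR Foundation 4.x) subscribes to ARTrackedImageManager and spawns content on each newly recognized reference image. It assumes a Unity scene with AR Foundation configured and a reference image library set up, so it runs only inside a Unity project:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example: overlay a virtual object on each recognized image.
public class ImageContentSpawner : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager trackedImageManager;
    [SerializeField] GameObject contentPrefab;   // assigned in the Inspector

    void OnEnable()  => trackedImageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => trackedImageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage image in args.added)
        {
            // Parent the content to the tracked image so it follows
            // the physical marker as the camera moves.
            Instantiate(contentPrefab, image.transform);
        }
    }
}
```

Because the prefab is parented to the tracked image's transform, it stays registered to the physical marker, which satisfies the real-time image recognition and tracking requirement.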

Non-Functional Requirements:
Performance: The AR application should run smoothly with minimal latency to provide
a seamless user experience.

Scalability: Design the application to handle varying levels of complexity and a growing
number of users or data.

Reliability: Ensure the system is reliable and stable, with minimal crashes or unexpected
behavior.

Security: Implement security measures to protect user data and ensure the safe operation of
the AR application.

Usability: The user interface should be intuitive and easy to navigate, promoting a positive user
experience.

Compatibility: Ensure compatibility with a range of devices, considering different screen sizes,
resolutions, and hardware capabilities.

Accessibility: Design the application to be accessible to users with different abilities, considering
factors like color contrast and text size.

Documentation: Provide comprehensive documentation for developers, detailing the usage of
AR Foundation features and any custom functionalities.

Update and Maintenance: Plan for easy updates and maintenance of the AR application,
allowing for future improvements and bug fixes.

3.4 SUMMARY:

The collaborative workflow between Blender and Unity in the context of AR Foundation
involves a seamless integration of 3D content creation and game development. Blender, a
versatile open-source 3D graphics software, serves as a comprehensive tool for modeling,
sculpting, and animating objects and scenes. Artists use Blender to craft detailed 3D assets,
characters, and animations. Unity, a popular game development engine, complements Blender
by providing a platform for integrating these assets into AR applications using AR Foundation.
This workflow enables developers to import Blender-created assets into Unity, where they can
be arranged, scripted, and deployed for augmented reality experiences. Unity's AR Foundation
extends its capabilities to incorporate AR features, such as image recognition and tracking. The
synergy between Blender and Unity within the AR Foundation framework facilitates a
streamlined process for creating immersive AR content, showcasing the collaborative power of
these tools in reshaping interactive and visually compelling augmented reality experiences.


CHAPTER 4
SYSTEM DESIGN

4.1 EXISTING SYSTEM

The existing system of an AR (Augmented Reality) application in education is a rapidly
evolving landscape that leverages cutting-edge technology to enhance learning experiences.
Several notable aspects define the current state of AR in education:

Content Enrichment:

AR applications are used to augment textbooks and learning materials. By scanning specific
pages or markers with a mobile device, students can access additional multimedia content, such
as 3D models, videos, and interactive simulations, providing a deeper understanding of the
subject matter.

Virtual Labs and Simulations:

AR brings lab experiences to life by allowing students to conduct experiments virtually. For
example, in chemistry, students can interact with virtual chemical reactions, observe molecular
structures, and understand scientific concepts in a controlled digital environment.

Historical and Cultural Immersion:

AR applications transport students to different time periods and locations, allowing them to
explore historical sites, ancient civilizations, and cultural landmarks through immersive 3D
reconstructions and interactive experiences.

Interactive Educational Games:

AR-based educational games engage students in learning through interactive challenges,
quizzes, and puzzles. These games make the learning process enjoyable and promote active
participation.

Language Learning and Vocabulary Building:

AR applications for language education use visual cues in the real world to help learners
associate words with objects, facilitating vocabulary acquisition. Students can point their
devices at objects, and the application provides translations or audio pronunciations.


Geography and Environmental Education:

AR applications provide interactive maps and overlays that allow students to explore
geographical features, ecosystems, and environmental changes. This technology enables a
hands-on learning approach to understand the Earth's geography.

Art and Creativity:

AR applications in art education enable students to create and manipulate virtual sculptures,
paintings, and designs in a 3D space. This fosters creativity and allows for experimentation with
different artistic styles and techniques.

Accessible Field Trips:

AR applications simulate virtual field trips, making it possible for students to visit museums,
historical sites, and natural landmarks without leaving the classroom. This addresses
accessibility challenges and allows for inclusive learning experiences.

STEM Education:

AR applications play a significant role in Science, Technology, Engineering, and Mathematics
(STEM) education. They facilitate interactive learning experiences in physics, biology,
mathematics, and engineering disciplines.

Personalized Learning Experiences:

AR applications can be customized to cater to different learning styles and abilities. They adapt
content presentation and difficulty levels to meet individual student needs, providing a more
personalized learning experience.

Collaborative Learning:

AR applications support collaborative learning experiences. Multiple users can interact with the
same augmented content simultaneously, fostering teamwork, communication skills, and
peer-to-peer learning.


4.1.1 ADVANTAGES OF EXISTING SYSTEM

The existing system of Augmented Reality (AR) applications in education brings forth a host of
advantages that significantly enhance the learning experience. Here are some key benefits:

Enhanced Engagement: AR applications captivate students' attention by providing interactive,
dynamic content that goes beyond traditional textbooks, making learning more engaging and
enjoyable.

Improved Understanding: AR helps students grasp complex and abstract concepts by
providing visual, interactive representations that facilitate a deeper understanding of the subject
matter.

Real-world Context: AR bridges the gap between theoretical knowledge and its real-world
applications. It allows students to apply what they've learned in practical, tangible settings.

Personalized Learning: AR applications can be tailored to cater to different learning styles and
paces, ensuring that each student receives a customized educational experience.

Multi-Sensory Learning: AR engages multiple senses, including visual, auditory, and
sometimes tactile, providing a holistic learning experience that reinforces memory retention.

Accessibility and Inclusivity: AR enables students to access educational content regardless of
physical or geographical limitations, making it an inclusive tool for learners with diverse needs.

Interactive Simulations and Experiments: AR allows students to conduct virtual experiments,
simulations, and activities that may be impractical, expensive, or dangerous in a traditional
classroom setting.

Motivation and Curiosity: The immersive nature of AR applications sparks curiosity and a
sense of wonder, motivating students to explore, ask questions, and seek answers independently.

Collaborative Learning: AR applications often support group activities and projects,
promoting teamwork, communication skills, and peer-to-peer learning.

Real-time Feedback and Assessment: AR applications can provide instant feedback on tasks,
quizzes, or assignments, allowing students to gauge their progress and adjust their approach
accordingly.

Preparation for Future Technologies: Familiarity with AR technology equips students with
skills relevant to the modern workforce, preparing them for careers in fields that increasingly
rely on augmented reality applications.

Cost-Efficiency: While the initial investment in AR technology may be a consideration, in the
long run AR can reduce costs associated with physical resources, such as lab equipment or
printed educational materials.

Adaptability to Diverse Subjects: AR applications have versatile uses across various
disciplines, including science, history, art, mathematics, and more, making AR a valuable tool
in a wide range of educational contexts.

Data-driven Insights: AR applications can collect data on student interactions, allowing
educators to assess learning outcomes, identify areas for improvement, and tailor instruction to
individual needs.

Future-Readiness: Embracing AR technology in education prepares students to adapt to and
embrace new technologies, a crucial skillset in a rapidly evolving digital landscape.

4.1.2 DRAWBACK OF EXISTING SYSTEM

Cost: Implementing AR technology can be expensive, requiring investments in hardware,
software, and training.

Technical Requirements: AR applications rely on compatible devices with specific hardware
capabilities, potentially limiting accessibility for some students.

Learning Curve: Both educators and students may need time to become proficient in using AR
applications effectively, potentially requiring additional training.

Content Development: Creating high-quality AR content can be time-consuming and may
require specialized skills in 3D modeling and programming.

Distraction Potential: In some cases, AR may introduce additional sensory stimuli that could
distract students from the intended learning objectives.

Dependency on Technology: Technical issues or device malfunctions can disrupt the learning
process, highlighting a reliance on technology.

Limited Curriculum Integration: Integrating AR into existing curricula may require careful
planning and alignment with educational standards, and not all subjects lend themselves easily
to AR applications.

Inequities in Access: Not all students may have access to the required devices, potentially
leading to disparities in access and opportunities.

4.2 PROPOSED SYSTEM

Introduction and Setup:


Augmented Reality (AR) has emerged as a transformative technology, and Unity AR Foundation
provides a robust framework for creating immersive AR experiences. Begin your AR app
development journey by setting up the development environment. Install Unity Hub and the latest
version of Unity, ensuring compatibility with AR Foundation. Use the Unity Package Manager to
add the AR Foundation package along with the ARKit and ARCore packages. These packages lay
the foundation for cross-platform AR experiences, supporting both iOS and Android devices. Once
the environment is set up, create a new Unity project and define the essential AR Foundation
components, such as ARSession and ARSessionOrigin, which manage the AR experience and
represent the tracked space in the real world.
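As a concrete illustration of this setup step, the sketch below shows how a bootstrap script might verify device AR support before enabling the session. It is a hypothetical example, not code from this project: the script name and the serialized `arSession` field are assumptions, while `ARSession.state` and `ARSession.CheckAvailability()` are standard AR Foundation APIs.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARAvailabilityCheck : MonoBehaviour
{
    // Assumed to be assigned in the Inspector from the AR Session object in the scene.
    [SerializeField] private ARSession arSession;

    private IEnumerator Start()
    {
        if (ARSession.state == ARSessionState.None ||
            ARSession.state == ARSessionState.CheckingAvailability)
        {
            // Ask the platform (ARKit/ARCore) whether AR is supported on this device.
            yield return ARSession.CheckAvailability();
        }

        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.Log("AR is not supported on this device.");
        }
        else
        {
            // Enable the session only once availability is confirmed.
            arSession.enabled = true;
        }
    }
}
```

A check like this prevents the app from silently failing on devices without ARCore/ARKit support.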

Core Functionality, Interactions, and Deployment:


With the groundwork laid, focus on the core functionalities of your AR app. Enable AR plane
detection to identify surfaces in the real world, and implement the ARPlaneManager to handle
detected planes. Integrate ARRaycastManager for precise interactions within the AR environment,
allowing users to interact with virtual elements overlaid on the real world. For object placement,
create or import 3D objects and employ raycasting or other intuitive methods. Utilize AR Anchors
to ensure virtual objects remain anchored in the physical space, providing a stable and realistic
user experience.
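The plane-detection and raycasting flow described above can be sketched as a small Unity component. This is a minimal, illustrative example rather than the project's actual placement code: the `objectPrefab` field is an assumed placeholder, and a real app would likely add placement validation and feedback.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlace : MonoBehaviour
{
    // Hypothetical prefab to place on detected surfaces.
    [SerializeField] private GameObject objectPrefab;

    private ARRaycastManager raycastManager;
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    private void Awake()
    {
        raycastManager = GetComponent<ARRaycastManager>();
    }

    private void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Cast a ray from the touch position against detected plane surfaces.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            // The first hit is the closest plane; place the object at its pose.
            Pose hitPose = hits[0].pose;
            Instantiate(objectPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```

Attached to the same GameObject as the ARRaycastManager (typically the AR Session Origin), a single tap would place the prefab on the nearest detected plane.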

Design an intuitive user interface (UI) with elements such as buttons and labels, enhancing user
engagement. Implement user interactions, such as tapping or dragging, to manipulate AR objects
seamlessly. Consider incorporating gestures and touch controls to make the experience more
immersive. Provide visual feedback to users for successful interactions, like highlighting selected
objects or displaying relevant information.


Testing is a critical phase in AR app development. Test your app on ARKit and ARCore
supported devices to identify and resolve any platform-specific issues. Debug and optimize your
app for performance, ensuring a smooth user experience. Configure build settings and deploy
your AR app to the App Store for iOS and Google Play for Android, making it accessible to a
broader audience.

In the iterative process, gather user feedback to enhance your AR app continually. Stay informed
about updates to AR Foundation, ARKit, and ARCore for potential improvements and new
features. Thoroughly document your code and provide clear instructions for users and future
developers, facilitating understanding and future development. As AR technology evolves, this
comprehensive approach ensures your AR app remains at the forefront of innovation and user
satisfaction.

Fig No:4.1 Block Diagram


4.3 DATA FLOW DIAGRAM:

A Data Flow Diagram (DFD) is a graphical representation of the flow of data within a system.
While AR applications, especially those using AR Foundation in Unity, might not have traditional data
flow in the same way as information systems, we can represent the flow of information and interactions
within the AR system. Here's a simplified DFD for an AR application using AR Foundation in Unity:

Fig No:4.2 DFD 0 Level

Description:

User Input:

Represents any input from the user, such as gestures, taps, or other interactions.


AR Foundation Module:

This module encompasses the AR Foundation framework in Unity, responsible for handling AR
functionalities like tracking, rendering, and interactions.

Image Recognition Algorithms:

This block represents algorithms responsible for recognizing images or patterns in the real-world
environment.

Virtual Objects Rendering:

Denotes the rendering of virtual objects in the AR scene based on the image recognition results.

4.4 SUMMARY:

The existing system may lack certain features, exhibit performance bottlenecks, or have
limitations in terms of user experience. In the context of AR, the image processing may be less accurate,
the interaction with virtual objects might be limited, and the overall system may not adapt well to
different environmental conditions. The need for enhanced features, improved performance, and a
more user-friendly interface becomes apparent through the limitations of the existing system.

The proposed system outlines the improvements and new features introduced to overcome the
shortcomings of the existing system. This may include advancements in image recognition algorithms
for more accurate tracking, enhanced AR interactions for users, improved rendering quality, and
adaptability to diverse real-world environments. Additionally, the proposed system could address
issues related to performance, security, and usability, providing a more robust and satisfying AR
experience. The introduction of new technologies, optimized code, and a refined user interface
contributes to the overall effectiveness of the proposed AR application.

In summary, the proposed system represents an evolution from the limitations of the existing
system, introducing advancements in image processing, AR interactions, and overall system
performance to deliver an upgraded and more feature-rich AR experience. The proposed system aims
to address user needs and expectations, providing a solution that not only overcomes existing
challenges but also sets the stage for future improvements and innovations in AR technology.


CHAPTER 5
IMPLEMENTATION
5.1 WORKING OF 3D MODEL:

Setting the Stage in Blender

Blender, an open-source 3D creation suite, offers a versatile platform for modeling,
sculpting, and rendering. After installing Blender, you're greeted by a dynamic interface. The
3D viewport is central, allowing you to navigate with ease using the mouse and keyboard
shortcuts. Adding a mesh, such as a cube or sphere, is as simple as pressing Shift + A or using
the "Add" menu. Transitioning to Edit Mode provides granular control, enabling manipulation
of vertices, edges, and faces.

The transformative capabilities of Blender shine through in Edit Mode, where you can move,
rotate, and scale components using shortcuts like G, R, and S. Constrain these transformations
along specific axes by appending X, Y, or Z after the action. To refine the model's surfaces,
consider applying a Subdivision Surface modifier, accessible through the modifier panel. This
step introduces a level of smoothness, crucial for achieving realistic and visually appealing 3D
models.

In the "Shading" workspace, the creative process expands to materials and textures. Assign
materials to the model, and add textures for a more nuanced appearance. UV mapping, crucial
for precise texture application, involves unwrapping the model's UVs in Edit Mode and
adjusting them in the dedicated UV Editor. For those delving into more organic forms, Blender's
Sculpt Mode offers a suite of brushes for detailed sculpting, expanding the creative possibilities
beyond traditional geometric shapes.

Illumination, Cameras, and Rendering Mastery

As the 3D model takes shape, attention turns to lighting, a critical factor for achieving realism
in renders. In the "Layout" or "Rendering" workspace, lights are added and configured in the
"Object Data Properties" panel. Proper lighting emphasizes the model's details and establishes
a mood within the scene. With a lit stage set, the next step involves camera setup. Placing and
adjusting the camera's position, rotation, and focal length define the viewpoint for rendering.


The rendering process in Blender involves choosing between the Eevee and Cycles rendering
engines, each offering unique strengths. Eevee excels in real-time rendering and is ideal for
quick previews, while Cycles focuses on accurate light interaction and is suitable for high-
quality final renders. Render settings are configured in the respective engines' panels, allowing
customization according to project requirements.

5.2 WORKING IN UNITY:

Foundations and Setup


Creating an Augmented Reality (AR) application in Unity using AR Foundation involves a
structured approach to leverage the capabilities of ARKit and ARCore. Begin by setting up
the development environment with Unity Hub and the latest Unity version. Once Unity is
installed, open Unity Hub, create a new project, and ensure that AR Foundation, ARKit, and
ARCore packages are added through the Unity Package Manager. These packages serve as the
backbone for cross-platform AR development, supporting both iOS and Android devices.
Establish a solid foundation by choosing the appropriate Unity version compatible with the
desired AR Foundation features.

With the environment configured, introduce AR Foundation's core components into the scene.
Drag and drop the AR Session prefab, which acts as the central manager for the AR experience,
and the AR Session Origin prefab, representing the tracked space in the real world. This initial
setup is pivotal for building an AR application that seamlessly integrates with the physical
environment. It forms the canvas upon which the AR experience will unfold, combining virtual
and real-world elements.

Interaction and Implementation


With the foundation in place, delve into the implementation of AR interactions. Implement
ARRaycastManager to enable raycasting, a fundamental technique for interacting with the AR
environment. Raycasting allows the app to identify surfaces, planes, or objects in the real
world, opening avenues for placing virtual objects precisely within the user's surroundings.
Explore the possibilities of object placement using raycasting or other interaction methods,
ensuring a seamless blending of the virtual and physical realms.


Stability is paramount in AR applications. Integrate AR Anchors into the design to attach virtual
objects to real-world points, ensuring they remain fixed in space relative to the physical
environment. This step enhances the user experience, providing a sense of persistence and
stability to virtual elements within the dynamic context of the AR environment.
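A minimal sketch of attaching content to an AR Anchor might look like the following. The method and `contentPrefab` field are illustrative assumptions; `ARAnchorManager.AttachAnchor` is the standard AR Foundation call for anchoring to a detected plane.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class AnchorPlacement : MonoBehaviour
{
    [SerializeField] private ARAnchorManager anchorManager;
    // Hypothetical content to keep fixed in physical space.
    [SerializeField] private GameObject contentPrefab;

    // Called with a plane and pose obtained from a raycast hit.
    public void PlaceAnchored(ARPlane plane, Pose hitPose)
    {
        // Attach an anchor to the detected plane at the hit pose.
        ARAnchor anchor = anchorManager.AttachAnchor(plane, hitPose);
        if (anchor != null)
        {
            // Parenting under the anchor keeps the content locked to that real-world point.
            Instantiate(contentPrefab, anchor.transform);
        }
    }
}
```

For points not on a plane, adding an `ARAnchor` component directly to a GameObject achieves a similar effect.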

Designing the user interface (UI) is the next crucial step. Consider the user experience and
implement intuitive controls, buttons, or other interactive elements to facilitate engagement
with the AR content. Implementation of AR interactions involves incorporating gestures and
touch controls, making the user experience more natural and immersive. Visual feedback
mechanisms are essential for guiding users through the AR experience, providing cues for
successful interactions, or conveying information relevant to the virtual elements.
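As one example of such a touch gesture, a pinch-to-scale behavior could be sketched roughly as follows. This is a simplified illustration that ignores edge cases such as touches over UI elements or scale limits.

```csharp
using UnityEngine;

// Attach to a placed AR object to let the user resize it with a two-finger pinch.
public class PinchToScale : MonoBehaviour
{
    private float previousDistance;

    private void Update()
    {
        if (Input.touchCount != 2) return;

        Touch t0 = Input.GetTouch(0);
        Touch t1 = Input.GetTouch(1);
        float distance = Vector2.Distance(t0.position, t1.position);

        if (t1.phase == TouchPhase.Began)
        {
            // Record the starting finger spacing when the pinch begins.
            previousDistance = distance;
            return;
        }

        if (previousDistance > 0f)
        {
            // Scale the object proportionally to the change in finger spacing.
            float factor = distance / previousDistance;
            transform.localScale *= factor;
        }
        previousDistance = distance;
    }
}
```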

Testing, Deployment, and Iterative Refinement


The testing phase is pivotal to ensure the AR application's functionality and performance. Test
the app rigorously on ARKit and ARCore supported devices, addressing any platform-specific
issues that may arise. Debug and optimize the application for performance, taking into account
the varying capabilities of different devices. Consider user feedback gathered during testing to
iterate on the application, refining interactions, improving stability, and enhancing the overall
user experience.

As the application matures through testing and iteration, prepare it for deployment. Configure
build settings within Unity to target specific platforms and publish the AR app to the App Store
for iOS and Google Play for Android. Ensure that the app adheres to platform-specific
guidelines and requirements for a seamless deployment process.

Continuously stay informed about updates to AR Foundation, ARKit, and ARCore for potential
enhancements and new features. Regularly iterate on the application based on user feedback,
emerging technologies, and evolving best practices. Robust documentation of code and
application features facilitates future development and ensures that the AR application remains
at the forefront of innovation and user satisfaction.


5.2.1 UNITY IN PHONE:

Setting Up Unity for Mobile Development

Embarking on the journey of enabling Unity on a phone involves a comprehensive set of steps
beginning with the installation of Unity Hub and a compatible Unity version. Unity Hub serves as
a centralized management tool for Unity versions, streamlining the process of project creation and
version control. Once a Unity version supporting the target platform—whether Android or iOS—
is installed, the creation of a new project unfolds within Unity Hub. This initial phase is crucial for
project setup, requiring decisions on templates, such as 3D or 2D, and meticulous configuration of
project settings.

As the development environment takes shape within the Unity editor, the subsequent step involves
crafting the desired application. This process entails importing assets, shaping scenes, and
implementing functionality through the use of C# scripts. Unity's intuitive interface empowers
developers to visualize and fine-tune their creations, fostering an iterative and creative development
cycle. The unity of design and functionality coalesces as the project matures, setting the stage for the
application's deployment.

Deployment, Testing, and Iterative Refinement

With the foundation laid, the focus shifts to configuring build settings within Unity. The Build
Settings menu becomes the gateway to specifying the target platform, initiating a switch to Android
or iOS, and adjusting platform-specific settings such as package names or bundle identifiers.
Building the application finalizes the encapsulation of the project into a standalone file, ready for
deployment.

The deployment phase involves the transfer of the built application to the intended mobile device.
For Android, this necessitates connecting the device to the development machine, enabling USB
debugging, and potentially installing required drivers. iOS deployment, on the other hand, demands
a Mac with Xcode installed and, if necessary, enrollment in the Apple Developer Program to deploy
on a physical iOS device. The actual running of the application on the mobile device marks the
fruition of the development efforts, bringing the Unity-powered creation to life.


Testing and debugging become paramount in this iterative process. Unity offers a suite of
development tools for real-time testing and debugging, facilitating the identification and resolution
of issues. For Android devices, establishing a direct connection between the Unity editor and the
phone enhances the efficiency of this phase. Meanwhile, for iOS, integrating the project with
Xcode provides developers with a deeper level of insight into performance and behavior.

Security considerations play a role, especially on Android devices, where ensuring the phone's
security settings permit the installation of applications from external sources is essential. Finally,
a commitment to documentation and staying informed about Unity updates ensures that the
development workflow remains aligned with best practices and platform-specific nuances. This
comprehensive approach, from initial setup to deployment, testing, and refinement, empowers
developers to harness the full potential of Unity on mobile devices, creating immersive and
engaging applications.

5.3 SUMMARY:

The collaborative workflow between Blender and Unity in the context of AR Foundation involves a
seamless integration of 3D content creation and game development. Blender, a versatile open-source 3D
graphics software, serves as a comprehensive tool for modeling, sculpting, and animating objects and
scenes. Artists use Blender to craft detailed 3D assets, characters, and animations. Unity, a popular game
development engine, complements Blender by providing a platform for integrating these assets into AR
applications using AR Foundation. This workflow enables developers to import Blender-created assets into
Unity, where they can be arranged, scripted, and deployed for augmented reality experiences. Unity's AR
Foundation extends its capabilities to incorporate AR features, such as image recognition and tracking. The
synergy between Blender and Unity within the AR Foundation framework facilitates a streamlined process
for creating immersive AR content, showcasing the collaborative power of these tools in reshaping
interactive and visually compelling augmented reality experiences.

CHAPTER 6
SYSTEM TESTING
6.1 TESTING LEVELS:

6.1.1 UNIT TESTING:

In the context of developing an Augmented Reality (AR) application using AR Foundation in
Unity, unit testing plays a crucial role in validating the functionality of individual components. Unit
tests focus on isolating and evaluating specific units of code, such as functions or classes, to ensure
they behave as intended. This involves setting up a testing framework compatible with Unity,
identifying key units for testing (e.g., image recognition, rendering), and creating test cases that cover
various input scenarios. During testing, assertions are used to verify that the actual output matches
the expected output. By automating these tests and incorporating them into the development process,
developers can catch and address issues early, maintain code reliability, and support ongoing
refactoring. For instance, a unit test for an image recognition class may involve providing a test image
path and asserting that the recognition function returns the expected result. This iterative approach to
testing enhances code quality and supports the overall robustness of the AR application.
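Under the Unity Test Framework (which bundles NUnit), such a test might be sketched as below. The test and its `spawned` dictionary are hypothetical, standing in for the prefab-lookup logic used in image tracking rather than reproducing the project's actual tests.

```csharp
using NUnit.Framework;
using System.Collections.Generic;
using UnityEngine;

public class ImageTrackingTests
{
    [Test]
    public void PrefabLookup_ReturnsPrefab_ForKnownImageName()
    {
        // Hypothetical dictionary mirroring the name-to-prefab lookup
        // performed when a tracked image is recognized.
        var spawned = new Dictionary<string, GameObject>();
        var go = new GameObject("marker_atom");
        spawned.Add("marker_atom", go);

        // Assert that a recognized image name resolves to its prefab.
        Assert.IsTrue(spawned.ContainsKey("marker_atom"));
        Assert.AreEqual("marker_atom", spawned["marker_atom"].name);
    }
}
```

Tests like this run in the Unity Test Runner's Edit Mode, so they can validate lookup logic without deploying to a device.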

6.1.2 INTEGRATION TESTING:

Integration testing for an Augmented Reality (AR) application utilizing image processing in
Unity involves validating the seamless interaction and cooperation between different components or
modules within the application. The objective is to ensure that these integrated elements work
harmoniously to deliver the intended functionality. In the context of image processing, this includes
testing the flow of data and operations between modules responsible for image recognition, tracking,
rendering, and other AR features. By assessing how these components collaborate, integration testing
helps identify potential issues such as data inconsistencies, communication errors, or interoperability
challenges. Through this process, developers can catch and address integration-related issues early in
the development lifecycle, ensuring that the AR application functions cohesively and delivers a
unified and reliable user experience.

6.1.3 SYSTEM TESTING:

System testing for an Augmented Reality (AR) application using image processing in Unity
is a comprehensive evaluation of the entire application as a unified system. This testing phase aims
to verify that all individual components, including image recognition, tracking, rendering, and user
interaction, work harmoniously to meet specified requirements. System testing involves testing the
application in various scenarios and conditions, assessing its behavior under different
environmental settings and user interactions. It ensures that the AR application functions as
intended, placing virtual objects accurately based on image recognition results, and adapting to
real-world changes. Additionally, system testing addresses aspects such as performance, security,
and overall user satisfaction, providing a thorough examination of the application's reliability and
functionality before deployment. Any discovered issues are addressed to guarantee a robust and
user-friendly AR experience.

6.1.4 ACCEPTANCE TESTING:

Acceptance testing for an Augmented Reality (AR) application with image processing in
Unity is the final phase of testing before deployment, focusing on validating that the application
meets the specified business requirements and user expectations. This testing assesses whether the
AR application delivers the intended features and functionalities in a real-world context. Typically
involving end-users or stakeholders, acceptance testing evaluates the application's usability,
performance, and adherence to predefined criteria. Users interact with the AR features, such as
image recognition and object rendering, providing feedback on the overall user experience. Any
necessary adjustments are made based on this feedback to ensure that the application aligns with
business goals and fulfills user needs. Successful acceptance testing indicates that the AR
application is ready for deployment, having undergone thorough evaluation from the perspective of
those who will ultimately use and benefit from it.

6.1.5 PERFORMANCE TESTING:

Performance testing for an Augmented Reality (AR) application incorporating image
processing in Unity is a critical evaluation aimed at assessing the application's responsiveness,
stability, and efficiency under various conditions. This testing phase involves systematically
analyzing the AR application's performance metrics, such as frame rates during image processing,
rendering quality, and overall responsiveness to user interactions. The goal is to identify potential
bottlenecks, memory leaks, or issues related to computational intensity that could impact the user
experience. Performance testing helps ensure that the AR application delivers a smooth and
immersive experience, particularly during image recognition and virtual object rendering, without
compromising the device's resources. By measuring and optimizing key performance indicators,
developers can enhance the application's efficiency and responsiveness, ensuring it meets the
required standards and provides a seamless AR experience for users.
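For instance, a simple frame-rate monitor like the hypothetical component below can surface one of these metrics during on-device testing; the smoothing constant is an arbitrary choice.

```csharp
using UnityEngine;

// Displays a smoothed FPS reading in the corner of the screen during testing.
public class FpsMonitor : MonoBehaviour
{
    private float smoothedDelta;

    private void Update()
    {
        // Exponentially smooth the frame time to avoid a jittery readout.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.1f);
    }

    private void OnGUI()
    {
        float fps = smoothedDelta > 0f ? 1f / smoothedDelta : 0f;
        GUI.Label(new Rect(10, 10, 200, 30), $"FPS: {fps:0.}");
    }
}
```

Unity's Profiler gives far more detail, but an on-screen counter is a quick sanity check on target hardware.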

6.2 SUMMARY:

In summary, testing for an Augmented Reality (AR) application integrating image
processing in Unity involves a multi-faceted approach to ensure the application's reliability,
functionality, and performance. The testing process encompasses various levels, starting with unit
testing where individual components are examined in isolation to validate their behavior.
Integration testing evaluates the seamless interaction between different modules, emphasizing the
interoperability of image recognition, tracking, and rendering components. System testing assesses
the AR application as a whole, considering user interactions, environmental conditions, and overall
functionality. Acceptance testing involves end-users to validate that the application meets business
requirements and provides a satisfactory user experience. Performance testing focuses on
evaluating the application's efficiency, responsiveness, and resource utilization, particularly during
image processing and rendering. Security, usability, and error handling aspects are scrutinized to
ensure a secure, user-friendly, and robust AR experience. Throughout this comprehensive testing
process, the goal is to identify and address potential issues early, providing a high-quality AR
application that aligns with user expectations and business objectives.


CHAPTER 7
CODING AND OUTPUT

7.1 MULTI TARGET IMAGE PROCESSING:

7.1.1 CODING:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARTrackedImageManager))]
public class imageTracking : MonoBehaviour
{
    [SerializeField]
    private GameObject[] placedPrefab;

    private Dictionary<string, GameObject> spawnedPrefab = new Dictionary<string, GameObject>();

    private ARTrackedImageManager trackedImageManager;

    private void Awake()
    {
        trackedImageManager = FindObjectOfType<ARTrackedImageManager>();

        // Instantiate one (initially hidden) prefab per reference image,
        // keyed by the prefab's name, which must match the reference image name.
        foreach (GameObject prefab in placedPrefab)
        {
            GameObject newPrefab = Instantiate(prefab, Vector3.zero, Quaternion.identity);
            newPrefab.name = prefab.name;
            spawnedPrefab.Add(prefab.name, newPrefab);
        }
    }

    private void OnEnable()
    {
        trackedImageManager.trackedImagesChanged += imageChanged;
    }

    private void OnDisable()
    {
        trackedImageManager.trackedImagesChanged -= imageChanged;
    }

    private void imageChanged(ARTrackedImagesChangedEventArgs eventArgs)
    {
        foreach (ARTrackedImage trackedImage in eventArgs.added)
        {
            updateImage(trackedImage);
        }
        foreach (ARTrackedImage trackedImage in eventArgs.updated)
        {
            updateImage(trackedImage);
        }
        foreach (ARTrackedImage trackedImage in eventArgs.removed)
        {
            // Look up by the reference image name so the key matches the dictionary.
            spawnedPrefab[trackedImage.referenceImage.name].SetActive(false);
        }
    }

    private void updateImage(ARTrackedImage trackedImage)
    {
        string name = trackedImage.referenceImage.name;
        Vector3 position = trackedImage.transform.position;

        GameObject prefab = spawnedPrefab[name];
        prefab.transform.position = position;
        prefab.SetActive(true);

        // Hide the prefabs of all other images so only the most recently
        // tracked target's prefab is visible.
        foreach (GameObject go in spawnedPrefab.Values)
        {
            if (go.name != name)
            {
                go.SetActive(false);
            }
        }
    }
}

7.1.2 DESCRIPTION:

The provided C# script is designed for Unity with the AR Foundation package and enables image
tracking in augmented reality (AR) applications. On initialization, the script builds a dictionary
that maps reference image names to GameObjects, instantiating one prefab for each entry in a
serialized array. The script subscribes to the `trackedImagesChanged` event of the
ARTrackedImageManager in `OnEnable` and unsubscribes in `OnDisable`, so it responds to
changes in tracked images only while the component is active. The `imageChanged` method
handles added, updated, and removed tracked images, calling `updateImage` to position and
activate the associated prefab. The `updateImage` method also ensures that only the relevant
prefab is visible by deactivating all others. Overall, this script establishes a foundation for
AR image tracking, enabling the dynamic placement and manipulation of GameObjects in
response to changes in the AR environment.
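Because `updateImage` looks prefabs up by reference image name, a mismatch between the prefab array and the reference image library would cause a KeyNotFoundException at runtime. The following sketch is a hypothetical, optional helper (not part of the project's script) that logs a warning at startup for any unmatched reference image, assuming the same naming convention:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical startup sanity check: warn if a reference image in the
// library has no prefab with a matching name.
public class ImageLibraryValidator : MonoBehaviour
{
    [SerializeField]
    private GameObject[] placedPrefab; // same array as the tracking script

    private void Start()
    {
        var manager = FindObjectOfType<ARTrackedImageManager>();
        var library = manager.referenceLibrary;

        // Walk the reference image library and check each name.
        for (int i = 0; i < library.count; i++)
        {
            XRReferenceImage image = library[i];
            bool hasPrefab = System.Array.Exists(placedPrefab, p => p.name == image.name);
            if (!hasPrefab)
            {
                Debug.LogWarning($"No prefab named '{image.name}' for this reference image.");
            }
        }
    }
}
```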

7.2 OUTPUT:

7.2.1 SOLAR SYSTEM:

EARTH:

Fig No:7.1 Earth

URANUS:

Fig No:7.2 Uranus

MARS:

Fig No:7.3 Mars


MERCURY:

Fig No:7.4 Mercury

NEPTUNE:

Fig No:7.5 Neptune


SUN:

Fig No:7.6 Sun

SATURN:

Fig No:7.7 Saturn


VENUS:

Fig No:7.8 Venus

7.2.2 SONOMETER:

Fig No:7.9 Sonometer


Fig No:7.10 Sonometer

Fig No:7.11 Sonometer


PROJECT DEVELOPMENT

The "Augmented Horizons of Exploring New Realities in Education" project represents
a groundbreaking endeavor poised to redefine the educational landscape. By harnessing the
robust capabilities of the Unity game engine, this initiative aims to seamlessly integrate
augmented reality technology into the learning process. Through dynamic digital overlays,
students will gain access to a wealth of interactive 3D models, simulations, and virtual field
trips, transcending the limitations of traditional educational materials. The core objective of
this project is to foster deeper understanding and engagement among learners of varied
backgrounds and learning styles.

Meticulous attention has been given to ensuring that the educational content aligns
seamlessly with established curriculum standards, enhancing the relevance and applicability
of the augmented reality experiences. Moreover, a user-centric approach to interface design
guarantees an intuitive and immersive learning experience. Rigorous testing protocols,
including functional and usability testing, have been implemented to ensure the application's
robustness and user-friendliness. Additionally, the project adheres to strict legal and ethical
considerations, safeguarding user privacy and intellectual property rights.

As the project nears its completion, there is a keen anticipation for the transformative impact
it is poised to make in educational spheres. Not only does it promise to enhance traditional
learning methods, but it also lays the foundation for future advancements in augmented
reality technology. Through this project, education is poised to transcend conventional
boundaries, ushering in a new era of interactive and engaging learning experiences.

Fig No:8.8 Dragon on AR


Conclusion Chapter 8
CHAPTER 8
CONCLUSION

In conclusion, the integration of Augmented Reality (AR) into education using the Unity platform
is a remarkable stride towards revolutionizing the learning experience. This innovative approach
marries advanced technology with educational content, providing students with interactive and
immersive lessons that transcend the boundaries of traditional teaching materials. By seamlessly
merging the physical and digital realms, AR in education stimulates deeper understanding,
engagement, and retention of knowledge, accommodating diverse learning styles.

The project's meticulous attention to aligning educational content with established curriculum
standards underscores its commitment to enhancing the educational process. Furthermore, the
user-centric design ensures an intuitive and accessible learning experience for all students.
Rigorous testing protocols, alongside adherence to legal and ethical considerations, affirm the
project's dedication to delivering a high-quality, responsible, and secure application.

Future Scope

FUTURE SCOPE

The future scope of Augmented Reality (AR) in education is incredibly promising, poised to
revolutionize the way knowledge is acquired and assimilated. As AR technology continues to
advance, it is expected to bring about a paradigm shift in traditional learning methods. Students
will soon find themselves immersed in interactive educational experiences where virtual objects
seamlessly blend with the physical world, enhancing comprehension and retention of complex
concepts. With the advent of AR glasses and wearables, the learning process will become even
more seamless, offering a hands-free and intuitive augmented learning environment. This
technology's potential for personalized learning paths is particularly exciting, as AR applications
can adapt content to suit individual learning styles and preferences.

Moreover, AR's capacity for real-time collaboration among students, regardless of their
physical location, opens up new frontiers for cooperative learning and problem-solving. As the
boundaries of AR's capabilities continue to expand, we can anticipate its integration into an
even wider range of subjects and disciplines, creating opportunities for more interactive and
engaging lessons. In essence, the future of AR in education promises a dynamic and inclusive
learning environment that caters to the diverse needs and preferences of students, ultimately
reshaping the landscape of education.

Bibliography

BIBLIOGRAPHY

[1] Billinghurst, M., & Dünser, A. (2012). Augmented Reality in the Classroom. In Mixed
and Augmented Reality (ISMAR), 2012 IEEE International Symposium on (pp. 441-442).
IEEE.

[2] Dede, C., & Richards, J. (2017). 21st century skills, education, and competitiveness: A
resource and policy guide. Routledge.

[3] FitzGerald, E., Adams, S., Ferguson, R., Gaved, M., Herodotou, C., Hillaire, G., ... &
Wimpenny, K. (2018). Augmented reality and mobile learning: The state of the art.
International Journal of Mobile and Blended Learning, 10(4), 1-17.

[4] Klopfer, E., & Squire, K. (2008). Environmental Detectives—the development of an
augmented reality platform for environmental simulations. Educational Technology
Research and Development, 56(2), 203-228.

[5] Liarokapis, F., & White, M. (2016). Educational augmented reality applications: Where
we are and what is next. Journal of Interactive Learning Research, 27(4), 325-343.

