
HOLY TRINITY COLLEGE

College of Engineering and Technology Education


GENERAL SANTOS CITY

Holy Trinity College Augmented Reality (AR) Indoor


Navigation System

ABBY PACULDO

March 2020
Holy Trinity College Augmented Reality (AR) Indoor
Navigation System

A Capstone Project
Presented to the Faculty of the
College of Engineering and Technology Education
Holy Trinity College of General Santos City

_________________________

In Partial Fulfillment
of the Requirements for the Degree
Bachelor of Science in Information Technology

_________________________

By:

Abby Paculdo

March 2020
CHAPTER 1

INTRODUCTION

Project Context

Navigation has always played a great role for humans when discovering new places. The evolution of different navigation techniques helped the human species spread across the planet. Now that most of the world is well explored, navigation remains an important part of society, and today's technology enables us to navigate in ways our ancestors never could.


With the rapid development of mobile communication and technology over the past decade, the need for accurate indoor navigation systems has been increasing. Smartphones have evolved to contain a GPS (Global Positioning System) unit, which has given rise to location-based mobile applications such as geofencing and automotive navigation for the common user. However, GPS can only locate devices to an accuracy of approximately 10 meters (Ye, 2012), and only in outdoor environments. For this reason, a number of alternative technologies have been developed and several attempts have been made to achieve accurate indoor navigation. Existing methods use infrared signals, ultrasound, the signal strength of various wireless connections such as GSM (Global System for Mobile Communications), Bluetooth, and Wi-Fi, inertial sensors that track user movements, and various digital image processing algorithms for positioning (Matuszka, Gombos, & Kiss, 2013).

Conventional indoor guidance, such as maps and signage around a building, does not direct users in a straightforward manner, since users need time to figure out their exact location and find the route to their desired destination. This problem is worse in buildings with very complicated internal layouts, such as the Holy Trinity campus.

The need for an indoor navigation system inside Holy Trinity College appears to be increasing along with the continuing development of establishments inside the campus. As a result of these expansions and developments, there is a great chance that some students will not be able to find their way around the campus. With the help of this system, users, especially students, can navigate the campus with ease.

Statement of the Problem

Navigating through Holy Trinity College is difficult, especially for new students and visitors. The campus is composed of five different buildings, which makes it harder to locate classrooms and offices.

Specifically, the proponents aim to develop a system that would provide solutions to the following problems:

1. Can the system provide easy navigation and directions to desired destinations for users, especially new students, staff, and visitors?

2. Can the system provide directions to the nearest available emergency exits on campus during emergency situations?

Objectives of the Study

The study aims to develop the Holy Trinity College Augmented Reality (AR) Indoor Navigation mobile application.

Specifically, this aims to:


Significance of the Study

The Holy Trinity College Augmented Reality (AR) Indoor Navigation

System will be of great benefit to the following:

● Students: Students are assigned to different classrooms every term of the school year. Some students, especially new enrollees, find it hard to navigate the school's facilities and be in their classrooms on time. With the use of this application, students will easily find their way to their classrooms.

● Faculty and Staff: Faculty and staff, especially newly hired employees who are still unfamiliar with the school's layout, can open the app, scan the map's QR image target, and select the office or destination they want to go to.

● Administrators: The school's administrators can install the application and use it to guide and tour visitors and potential enrollees around the campus.

● Guests: Visitors can install and use the application to find the location of the office they want to visit. The AR technology will make it easier for guests to navigate through the buildings.


● Future researchers: AR is a new technology with much to offer in the world of innovation. This study will serve as a guide and a tool to help future researchers in developing Augmented Reality applications.

Scope and Limitations of the Study

The scope of the Holy Trinity College AR Indoor Navigation System covers all users who will visit and navigate the campus. The user must have a 4G smartphone running at least the Android 7 (Nougat) operating system. Users select a target location from the default destination list provided in the application, which consists of the VP Academic Office, Computer Laboratory II, CETE Dean's Office, IT Center 1, CETE Faculty Office, and the three emergency exits, namely the College, High School, and Elementary gates. The system then calculates the shortest path and displays a path guide to the selected destination. The information is displayed as a series of waypoints, visualized as icons standing in the environment. These icons are represented by arrows that show the direction the user should move. Simple directional information is also displayed if the user cannot distinguish the next waypoint because they are looking in the wrong direction.
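To illustrate the shortest-path step, a minimal sketch in C# is given below. It runs Dijkstra's algorithm over a hypothetical waypoint graph; the node names, connections, and distances are illustrative assumptions only, not the actual campus data, which the real application would derive from the rendered map.

```csharp
using System;
using System.Collections.Generic;

// Illustrative sketch: Dijkstra's algorithm over a waypoint graph.
// Node names and distances below are assumptions, not real campus data.
public static class WaypointRouter
{
    public static List<string> ShortestPath(
        Dictionary<string, List<(string to, float metres)>> graph,
        string start, string goal)
    {
        var dist = new Dictionary<string, float>();
        var prev = new Dictionary<string, string>();
        var unvisited = new HashSet<string>(graph.Keys);
        foreach (var node in graph.Keys) dist[node] = float.PositiveInfinity;
        dist[start] = 0f;

        while (unvisited.Count > 0)
        {
            // take the unvisited node with the smallest tentative distance
            string current = null;
            foreach (var node in unvisited)
                if (current == null || dist[node] < dist[current]) current = node;
            if (current == goal || float.IsPositiveInfinity(dist[current])) break;
            unvisited.Remove(current);

            foreach (var (to, metres) in graph[current])
            {
                float candidate = dist[current] + metres;
                if (candidate < dist[to]) { dist[to] = candidate; prev[to] = current; }
            }
        }

        // walk back from the goal to recover the ordered list of waypoints
        var path = new List<string>();
        string step = goal;
        while (step != null)
        {
            path.Insert(0, step);
            step = prev.TryGetValue(step, out var p) ? p : null;
        }
        return path; // contains only the goal if it was unreachable
    }

    public static void Main()
    {
        // Hypothetical fragment of the campus graph (distances in metres).
        var graph = new Dictionary<string, List<(string, float)>>
        {
            ["College Gate"] = new List<(string, float)> { ("Main Hallway", 20f) },
            ["Main Hallway"] = new List<(string, float)>
                { ("College Gate", 20f), ("VP Academic Office", 35f), ("IT Center 1", 25f) },
            ["VP Academic Office"] = new List<(string, float)> { ("Main Hallway", 35f) },
            ["IT Center 1"] = new List<(string, float)> { ("Main Hallway", 25f) },
        };
        Console.WriteLine(string.Join(" -> ",
            ShortestPath(graph, "College Gate", "VP Academic Office")));
        // prints: College Gate -> Main Hallway -> VP Academic Office
    }
}
```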

The whole Holy Trinity campus is composed of more than five buildings, and rendering a 3D map of all of them would consume too much storage on the hardware devices used to develop the application. Hence, the application is limited to showcasing AR navigation to predetermined locations, namely the VP Academic Office, Computer Laboratory 2, CETE Dean's Office, IT Center 1, CETE Faculty Office, and the three main gates (College, High School, and Elementary). A QR image target will be positioned at every possible entry point on the campus; the current system uses the College gate as the default entry point. The Holy Trinity College AR Indoor Navigation System will only be developed for Android devices. The device must support ARCore via Google Play Services for AR, which enables AR experiences built with an ARCore SDK. The application requires a high-performing smartphone capable of handling heavy AR applications.
Definition of Terms

CHAPTER II

REVIEW OF RELATED LITERATURE AND SYSTEMS

Navigation requires constant monitoring of the user's location and the ability to dynamically plan and follow a directed path to the person's destination. The Global Positioning System (GPS) made outdoor navigation relatively straightforward, but because its signals are weaker inside buildings, indoor navigation has been difficult to achieve. However, advances in mobile capabilities have given rise to new technologies and tools that can help solve indoor navigation problems.

Rosenberg (1992) created the first real operational and immersive AR system, Virtual Fixtures. This robotic system overlays information on top of a worker's environment to help with efficiency. Virtual fixtures are computer-generated percepts overlaid on top of the reflection of a remote environment, which can provide benefits similar to those of tools and fixtures in the real world. The system was developed at the U.S. Air Force Research Laboratory (Armstrong Labs) while Rosenberg was working as a graduate researcher.

In NASA's flight tests from 1998 to 2002, the NASA X-38 was flown using a Hybrid Synthetic Vision system that overlaid map data on video to provide enhanced navigation for the spacecraft. It was useful in times of limited visibility, such as when the video camera window frosted over. AR was first used for navigation during this development ("Interesting History of Augmented Reality", 2018).


A comparative study conducted by Rehman (2016) showed varying results when AR was used on different implementation devices and compared with traditional paper maps. Results showed that digital navigation aids such as AR provide turn-by-turn guidance, require less mental workload and time, and are perceived as more accurate than traditional paper maps.

In 2009, a mobile AR camera-projector unit called Map Torchlight was developed (Schöning et al., 2009). It uses a projector to superimpose points of interest (POIs), streets, and other spatial information on top of a paper map. It was implemented on a Nokia N95 camera phone with a mobile projector attached to the phone using an audio-video cable.

Baus and his co-authors investigated map-based mobile guides and pointed out that mobile navigation system users prefer a perspective view over a bird's-eye view of a map (Baus et al., 2005).

One of the first Augmented Reality applications ever created and deployed on a mobile device was the Invisible Train game (Parhizkar et al., 2012). The project was created by Daniel Wagner, Thomas Pintaric, and Florian Ledermann from January 2004 to December 2005. The Invisible Train was the first real multi-user Augmented Reality application for handheld devices or Personal Digital Assistants (PDAs). It operates independently on off-the-shelf PDAs, eliminating the need for expensive resources. In the game, players control virtual trains on a real wooden miniature railroad track. These virtual trains can only be seen by the players through the PDA's video display, since they do not exist in the physical world. This type of interface is called the "magic lens metaphor" ("The Invisible Train: A Handheld Augmented Reality Game", n.d.).

Zhong (2014) developed an AR indoor navigation application deployed at the Center for Information Technology Research in the Interest of Society (CITRIS) Invention Lab at Berkeley, where users can navigate through the laboratory with the help of an augmented graphics layer overlaid on top of the camera view. The goal of the study was to provide augmented indoor navigation to lab equipment and step-by-step instructions for devices such as 3D printers and laser cutters. The application also features device reservation lookup.

A group augmented reality mobile navigation system supporting indoor positioning and group communication functions was developed by Wang et al. (2015) for any type of exhibit. It combines different technologies, such as markerless image identification, active RFID indoor positioning, and group communication, allowing users to directly capture and analyze exhibit pictures with a mobile device. 3D navigation information and group members' real-time positions on a map interface are also featured in the application. The system also supports text communication and image sharing to achieve an efficient group navigation mode.

Today, Google Maps has become the go-to indoor and outdoor navigation tool around the globe. In the US, Google's indoor maps and navigation have been activated in more than 250 venues such as airports, shopping malls, and universities, and over 10,000 floor plans are available throughout the world. AR is also used in settings where workers can see information about the objects they are working with, and in museums, where artifacts can be tagged with information such as the artifact's historical context or where it was discovered (Parhizkar et al., 2012).

Many different types of devices have been used for AR applications. Handheld devices such as tablets and smartphones are equipped with high-definition displays, high-quality cameras, fast processors, and sensors that can support accurate tracking, making them well suited for AR applications (Rehman, 2016).

Today's smartphones are equipped with many different sensors. Instead of using Bluetooth or Wi-Fi, the smartphone's camera can be used for an indoor positioning system (IPS). An augmented reality form of IPS can be achieved by comparing pictures taken by the camera with a set of pictures stored in a database, making it possible to position the smartphone (Delail et al., 2013). The camera can also be used for visible light communication (Luo et al., 2017). This approach uses the emitted light itself and can provide very good accuracy while being free from radio frequencies.

AR stretches from personal computers to handheld devices and platforms. Many AR mobile applications have been created and are being used in various fields, deployed on different devices. With handheld devices, whether phones or tablets, AR merges the real world and the virtual world to create a lifelike experience.


CHAPTER III

TECHNOLOGY BACKGROUND

Many scientific papers have been written about the different technologies used for indoor navigation and positioning systems. Generally, two groups of technologies can be used to implement an Augmented Reality indoor navigation system: wireless transmission methods and computer vision methods.

Wireless transmission methods use technologies such as Ultra-Wideband (UWB), Wireless Local Area Networks (WLAN), and Radio Frequency Identification (RFID) to localize a device. These technologies require physical devices installed in the environment, such as beacons and Wi-Fi routers. They do not give very accurate results and are prone to localization errors; Bluetooth devices, for example, have high latency during the detection phase. Although very popular, these technologies have difficulty estimating the user's exact position and orientation and are therefore not ideal for Augmented Reality applications.

Computer vision methods, on the other hand, are more suitable for implementing Augmented Reality-based applications. Computer vision is an interdisciplinary scientific field concerned with how computers understand digital videos and images. It acquires, processes, analyzes, and understands these images, much as the retina would, and extracts their data, turning them into numbers and symbols (O'Brien, 2019). For Augmented Reality to work, the computer must also understand the context of the physical world. Computer vision techniques embed predefined markers into objects; AR systems recognize these objects through cameras and overlay additional information and content onto them. In this way, computer vision methods combine real-world and Augmented Reality data.

One popular computer vision method is SLAM (Simultaneous Localization and Mapping), which originally came from robotics research. SLAM can provide a geometric position for the AR system. It is capable of building 3D maps of an environment while tracking the location and pose of the camera in that environment. SLAM algorithms estimate the position of the image sensor while simultaneously modeling the environment to create a map. Knowledge of the sensor's position and pose, in combination with the generated 3D map of the environment, enables the device to accurately navigate the environment.
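As an illustration of how the pose produced by SLAM-based tracking can be consumed, the sketch below samples an AR camera transform each frame and checks the distance to the next waypoint. The component, field names, and threshold are assumptions for illustration, not part of the actual project code.

```csharp
using UnityEngine;

// Illustrative sketch only: once SLAM-based tracking drives the AR camera,
// its pose can be sampled every frame to measure progress along the path.
// 'arCamera', 'nextWaypoint', and the threshold are assumed values.
public class WaypointProgress : MonoBehaviour
{
    public Transform arCamera;            // the camera moved by the tracking system
    public Transform nextWaypoint;        // the waypoint the user is walking toward
    public float reachedThreshold = 1.5f; // metres

    void Update()
    {
        float distance = Vector3.Distance(arCamera.position, nextWaypoint.position);
        if (distance < reachedThreshold)
        {
            // The navigation logic would advance to the next waypoint here.
            Debug.Log("Waypoint reached; advance to the next one.");
        }
    }
}
```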

Another popular approach is location-based AR, in which applications utilize GPS data together with the mobile device's compass, accelerometer, and gyroscope. The latter two relate to direction and orientation: the accelerometer helps the smartphone determine which direction it is facing when the user looks at a map, while the gyroscope detects rotation and changes in the device's orientation. Location-based AR uses geolocation, rather than an embedded marker, to display content. Using the smartphone's camera, the application localizes the device in an environment based on its geolocation and determines its orientation (Paucher & Turk, 2010). In the current study, we therefore use location-based Augmented Reality combined with SLAM technology, which allows users to navigate to destinations with directions displayed right on top of the physical roads or pathways in front of them.
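A minimal Unity sketch of reading the sensors mentioned above is shown below, assuming the standard Input.location, Input.compass, and Input.gyro APIs; the class name and logging are illustrative only.

```csharp
using UnityEngine;

// Minimal sketch of reading the location, compass, and gyroscope sensors
// described above. How the values feed the AR overlay is left out here.
public class DeviceOrientationReader : MonoBehaviour
{
    void Start()
    {
        Input.location.Start();        // begin GPS/location updates
        Input.compass.enabled = true;  // magnetometer for heading
        Input.gyro.enabled = true;     // gyroscope for orientation changes
    }

    void Update()
    {
        if (Input.location.status != LocationServiceStatus.Running) return;

        float heading = Input.compass.trueHeading;   // degrees from true north
        Quaternion attitude = Input.gyro.attitude;   // device rotation
        var fix = Input.location.lastData;           // latitude/longitude

        // The navigation layer could compare 'heading' with the bearing to the
        // next waypoint and show a textual hint when the user faces the wrong way.
        Debug.Log($"Heading {heading:F0} deg, attitude {attitude.eulerAngles}, " +
                  $"lat {fix.latitude:F5}, lon {fix.longitude:F5}");
    }
}
```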

The application will be developed in the Unity3D game engine, a free cross-platform game engine developed by Unity Technologies. The application can be developed with Unity version 2019.2.11f1 or higher. Unity is responsible for stitching together all of the components and technologies used to develop the application.

AutoCAD is computer-aided design (CAD) software that architects, engineers, and construction professionals rely on to create precise 2D and 3D drawings. AutoCAD was used to render the school's floor plan, which served as the basis for rendering the 3D map used in the application.

SketchUp is a 3D modeling program used for a wide range of drawing applications such as architecture, interior design, civil and mechanical engineering, film, and video game design. It is available in a freeware version, SketchUp Make, and a paid version with additional functionality, SketchUp Pro. SketchUp was used to render the 3D map used in the system. The version used was SketchUp 2014 with the V-Ray plugin, a twin-engine rendering architecture built to utilize the latest CPU and GPU computing technology; its large array of tools makes rendering faster and easier. Using this software, the initial 3D model was developed.

Finalization of the 3D map was done in Revit (2019), a BIM-based 3D modeling software that allows for intelligent, parametric, object-based design and provides full bi-directional associativity. Different components for the interior and exterior designs (e.g., colors, pots, plants) were added to the map.

Microsoft Visual Studio is an integrated development environment (IDE) for Microsoft Windows. It is a tool for writing computer programs, websites, web apps, and web services. It includes a code editor, debugger, GUI design tool, and database schema designer, and supports most major revision control systems. It is available in a free "Community" edition as well as paid commercial versions. Visual Studio 2019 is the programming IDE option inside the latest versions of Unity3D, where C# is used as the scripting language for the scripts built by Unity's compiler.

ARFoundation 3.0 allows developers to work with augmented reality platforms in a multi-platform way within Unity. The package presents an interface for Unity developers to use but does not implement any AR features itself; it works with devices that support world tracking. It allows installation of the ARCore plugin, which implements the camera, depth, input, planes, raycast, reference points, and session XR subsystems.
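As a hedged sketch of how the raycast and plane subsystems exposed through ARFoundation might be used, the component below casts a ray against detected planes where the user taps and instantiates a waypoint arrow there. The prefab and field names are assumptions; the actual project may place waypoints differently.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical sketch: raycast against planes detected by ARFoundation/ARCore
// and drop a waypoint marker where the user taps.
[RequireComponent(typeof(ARRaycastManager))]
public class WaypointPlacer : MonoBehaviour
{
    public GameObject waypointArrowPrefab;   // arrow icon shown in the AR view (assumed prefab)
    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake() => raycastManager = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // PlaneWithinPolygon limits hits to surfaces ARCore has actually mapped.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            Instantiate(waypointArrowPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```

Such a component would typically sit on the AR Session Origin, which already carries the ARRaycastManager.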

The ZXing Unity library is a Unity plugin used to generate the QR image target for the map. For now, the system uses one QR image target, which is placed by default at the College gate. Scanning the QR image target allows the user to proceed to the next interface, where they can choose their destination.
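A sketch of generating a QR image target with the plugin is shown below, assuming the ZXing.Net Unity build in which BarcodeWriter renders to a Color32[] array; the encoded payload "college-gate" is a placeholder, not the real identifier.

```csharp
using UnityEngine;
using ZXing;
using ZXing.QrCode;

// Hedged sketch of generating a QR image target with the ZXing Unity plugin.
public class QrTargetGenerator : MonoBehaviour
{
    public Texture2D GenerateQr(string content, int size = 256)
    {
        var writer = new BarcodeWriter
        {
            Format = BarcodeFormat.QR_CODE,
            Options = new QrCodeEncodingOptions { Width = size, Height = size }
        };

        Color32[] pixels = writer.Write(content);  // encode the text into pixels
        var texture = new Texture2D(size, size);
        texture.SetPixels32(pixels);
        texture.Apply();                           // upload to the GPU
        return texture;                            // e.g. printed and posted at the gate
    }

    void Start()
    {
        // Assumed example payload for the marker at the default entry point.
        Texture2D qr = GenerateQr("college-gate");
        Debug.Log($"Generated QR texture {qr.width}x{qr.height}");
    }
}
```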

The ARCore Image Database supports reading, adding, updating, and removing reference images from a dataset and can store information for up to 1,000 images. In this project, the image database contains the rendered 3D map of the area and the QR code generated with the ZXing Unity library plugin.
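A hedged sketch of reacting to a recognized reference image through ARFoundation's tracked image manager (backed by the ARCore image database on Android) is shown below. The reference image name "CollegeGateQR" is an assumed label for illustration.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch: respond when the QR image target stored in the
// reference image library is recognised at the entry point.
[RequireComponent(typeof(ARTrackedImageManager))]
public class EntryPointDetector : MonoBehaviour
{
    ARTrackedImageManager imageManager;

    void Awake() => imageManager = GetComponent<ARTrackedImageManager>();
    void OnEnable() => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage image in args.added)
        {
            if (image.referenceImage.name == "CollegeGateQR")
            {
                // Anchor the navigation origin at the recognised entry point,
                // then let the user pick a destination from the list.
                Debug.Log($"Entry point detected at {image.transform.position}");
            }
        }
    }
}
```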
CHAPTER IV

METHODOLOGY

Software Development Life Cycle Model

In developing the mobile augmented reality application, the researchers used the Iterative Waterfall Model. This model is a systematic, sequential approach that starts from system-level requirements and then proceeds to the analysis, design, coding, testing/verification, and maintenance stages. The iterative waterfall model provides feedback paths from every phase to its preceding phases, which is the main difference from the classical waterfall model.

Figure 1: Iterative Waterfall Model


Requirements and Analysis

In this stage, requirements gathering, data gathering, analysis, and planning are intensified and focused on the needs of the application to be developed. To understand the nature of the application to be made, the proponents must understand the information domain of the software, for example the functions needed, the UI, and the data to be processed. The researchers went through a series of consultations with their adviser and conducted a survey among students to understand their struggles in moving around the campus. The data gathered were then analyzed to identify the right technologies and methods and to understand the business process to be used in the next stages.

Design

After thorough analysis, the design stage comes in. In this stage, the researchers establish the requirements needed to support the findings of the previous stage and to prepare for the next stage, the development of the application. After identifying the recommended technologies, language, IDE, and SDK, an initial UI was designed for the application. A prototype was created using a storyboarding plugin in Microsoft PowerPoint with the help of Photoshop. All available documentation, including demos, tutorials, code, articles, and previously developed related applications, was gathered to prepare for the next stage.


Development

In this stage, all needed software and hardware requirements are set up to implement the data and UI design created in the previous stages. The data gathered and the design created are transformed into code and compiled together to form the whole application. The UI development started with the home scene, which appears after opening the application. A second scene was made for the QR image target scanner, and another scene serves as the main UI of the application. Knowledge gained from demos and tutorials is also applied here.
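A minimal sketch of moving between the scenes described in this stage is shown below; the scene names are assumed labels, the methods would typically be wired to UI buttons through Unity's OnClick events, and the scenes must be registered in the project's build settings.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Minimal sketch of switching between the home, QR scanner, and main UI scenes.
// Scene names are assumptions, not necessarily those used in the actual project.
public class SceneFlow : MonoBehaviour
{
    public void OpenQrScanner() => SceneManager.LoadScene("QrScanner");

    public void OpenMainNavigation() => SceneManager.LoadScene("MainNavigation");

    public void BackToHome() => SceneManager.LoadScene("Home");
}
```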

Validation

In this stage, data from the planning stage are cross-checked against the application created during development. All application functions must be tested so that the application is free from bugs and errors, and the results must strictly accord with the needs defined previously. Thorough testing and evaluation are needed to make sure the developed application is in line with the needs of the users and the objectives of the system. Feedback from the validation phase allows the researcher to rework errors, and changes are applied before the deployment of the application.
Deployment

This is the last stage of the process, where the researcher can deploy the application after going through all the stages, tests, and changes needed to meet the requirements. The application must meet the objectives and answer the problems stated in the previous chapters.
CHAPTER V

SYSTEM ANALYSIS, DESIGN, CONCEPTUAL FRAMEWORK AND

IMPLEMENTATION

System Analysis

The figure below shows the system architecture of the Holy Trinity College AR Indoor Navigation System. The software used to render the floor plan and 3D map were AutoCAD and Revit. The programming IDE used was Microsoft Visual Studio with the C# language. Unity3D's ARFoundation library with the ARCore XR plugin made world tracking and AR navigation possible in this application.

Figure 1: System Architecture


The figure below shows the use case diagram of the Holy Trinity College AR Indoor Navigation System. The system can be accessed by two actors, the user and the admin. The image shows the context and the interactions of the actors with the system.

Figure 2: Use Case Diagram

The figure below shows the activity diagram of the system. The image shows the different activities inside the system that the user may take.
Figure 3: Activity Diagram

Context Diagram

Figure 5: Context Diagram

Description
Level 0 Diagram

Figure 6: Level 0 Diagram


Conceptual Framework
System Design and Implementation

Database Design

Figure 7: Entity Relationship Diagram

Figure 8: Fully Attributed Data Model


Hardware Requirements

This section lists the minimum hardware requirements of the machine the researcher used to develop the application, as well as of the user's smartphone compatible with the technologies used:

● Processor – Intel(R) Core(TM) i7-8750H CPU @ 2.20GHz or higher
● Storage – 1TB HDD and 250GB SSD
● Memory – 8GB or higher
● Smartphone memory – 2.0GB or higher
● Smartphone screen – 4.0 in. or larger

Software Requirements

This section lists the minimum software requirements of the computer and smartphone that the researcher used to develop the application:

● Operating System – Windows 10 64-bit
● Unity3D – Version 2019.2.11f1 or higher
● Android OS – Version 7.0 (Nougat) or higher
● AutoCAD – Version 2014 or higher
● SketchUp – Version 2014 or higher
● Java JDK – jdk1.8.0_152
● Android SDK and NDK


REFERENCES
