Feras Exams Wiley Revised r2
All content following this page was uploaded by Feras Al-hawari on 06 May 2019.
1Information Systems and Technology Center, 2School of Basic Sciences and Humanities, German Jordanian University, Amman, Jordan.
Abstract: This paper discusses a web-based examination management system that was implemented in-house. The system is designed according to the Java Enterprise Edition three-tier architecture and allows exams to be defined and set up in a flexible tree-based structure. It integrates a rich text editor for composing exams suited to different engineering and language disciplines, and it automates the scheduling, grading and reporting processes to relieve instructors of such cumbersome tasks. Its capabilities and integration with different databases enable it to offer several security schemes that support strong multifactor authentication and authorization, detect impersonation and prevent cheating. It also provides an easy-to-use and informative wizard that enables students to take exams. The deployment results illustrate that the system has been used successfully to organize online exams in several semesters over the past three academic years. Finally, the responses to the conducted user surveys assert that the system is user friendly, cutting edge, capable, reliable, fast and highly available.
Keywords: online exams, web-based application, three-tier architecture, ISO/IEC 29110, software engineering
and application security.
1. INTRODUCTION
Computerized exams have gained a lot of popularity in the past years as they support e-learning
and facilitate the management (i.e., setup, scheduling and grading) of exams that are taken by a large
number of students. However, adopting a suitable system that meets the needs of a certain institution
can be challenging due to several concerns related to integration with existing systems, security,
customization, ease of use and cost. Hence, the main aim of this paper is to introduce a web-based
Examination Management System (EMS) that has been developed in-house and is fully integrated with
the MyGJU portal [1] in order to tackle the previous issues at the German Jordanian University (GJU).
The EMS has been mainly developed to offer an online-based replacement for the Arabic,
mathematics and engineering paper-based placement exams that are conducted every semester at the
GJU to determine the competence of newly accepted students in the respective fields. Such exams are typically taken by a large number of students; when they were paper-based, they required significant preparation time and incurred considerable costs in paper, ink and labour.
Furthermore, the paper-based exams entailed a manual grading process that is time consuming,
cumbersome and error-prone. Hence, the availability of an online exam system was imperative to
overcome the aforementioned issues pertaining to paper-based exams.
In particular, the adopted online exam system needed to address the following main concerns to
satisfy the desired GJU requirements:
• Provide an exam administration portal that allows administrators to manage user roles and enables
instructors to define exam types, specify exam topics, compose topic questions for different
disciplines (e.g., languages, engineering and mathematics), generate random exam forms, manage
exam sessions, assign students to sessions, compute student grades in the click of a button, as well
as generate reports for sessions, students and grades.
• Support an exam wizard that is easy to use and enables students to take the designated exams.
• Implement different security schemes to prevent unauthorized system access, eliminate cheating,
avoid impersonation, control the exam rooms, encrypt the communicated data, protect the servers
and secure the database.
• Use Single Sign On (SSO) based authentication to enable administrators, instructors and students
to log in to the exam system using the same credentials (i.e., username and password) that they use
to log in to other systems (e.g., MyGJU portal and E-Mail). Accordingly, users do not have to
memorize several passwords and administrators are relieved from the cumbersome task of assigning
new credentials to every user.
• Integrate with the Student Information System (SIS) [2] that is implemented in-house and used
by registrars to manage admission, registration and student information at GJU. For example, the
SIS supports features such as: system setup (e.g., to define buildings, rooms, faculties, departments
and majors), academic setup (e.g., to manage semesters, courses and course sections), admission,
managing student information (e.g., personal, academic, schedules and transcripts), registration
(i.e., enrolling students in course sections), computing grades and graduation. Accordingly, since
the exam system relies on a lot of the SIS data (e.g., buildings, rooms, semesters, courses, students,
majors, etc.), it must be integrated with the SIS to avoid functionality duplication (e.g.,
implementing the manage buildings, define semester and add student screens in both systems) and
prevent information inconsistency (e.g., same student with different names in each system), which
reduces the setup steps and thus improves usability. Similarly, the SIS should also gain direct access
to the exams data and hence make it available in the respective student accounts.
• Integrate with the Accounting Information System (AIS) [3] that is also developed in-house and
allows the finance department at GJU to define fees (e.g., service and tuition fees), manage daily
financial transactions (e.g., payments and refunds), issue registration invoices and administer
student financial accounts. Respectively, the exam system needs to gain direct access to the student
financial accounts and thus eliminate the need to import that information in order to validate
whether or not a student has paid a required exam fee.
• Meet the desired software quality attributes such as ease of use, responsiveness, reliability and
availability.
Correspondingly, the EMS has been developed with the previous requirements in mind. The system is fully integrated with the SIS and AIS for seamless data exchange and fewer setup steps. It provides multifactor authentication and authorization for stronger security, and it supports a tree-based exam structure that makes managing and archiving exams easier. In addition, it integrates a rich text editor that allows composing questions and answers for different majors, and it is capable of generating many forms of an exam to prevent cheating. It also automates the scheduling and grading of exams to relieve instructors of such cumbersome tasks, and it enables the generation of various reports that facilitate the analysis and documentation of exams.
The rest of the paper is organized as follows: a comparison with related work is presented in section
2; the instructor and student views, along with the main EMS capabilities, are discussed in section
3; the algorithms to generate exam forms, schedule students to sessions, as well as compute students' grades
are shown in section 4; the database design is explained in section 5; the project management and
software development processes in addition to the three-tier application architecture are considered in
section 6; the system security schemes are clarified in section 7; the deployment results and user survey
responses are illustrated in section 8; and a summary and conclusions are touched upon in section 9.
2. RELATED WORK
The full design of computerized examination systems was explored by many researchers such as
those in [4-20]. For example, the challenges when introducing electronic exams were explored in [4,5].
In addition, the trial results for several e-Exam systems were discussed in [6], with a focus on making
those systems convenient, reliable and educationally beneficial. In [7], an online exam system that
supports the automatic generation of digital circuits questions was introduced. Also, an online English
examination system based on the J2EE platform was shown in [8], whereas an Arabic web-based exams
management system was explained in [9]. Besides, several systems such as those in [10-12] were
developed to support online exams on mobile platforms (e.g., Android as in [10]) and devices (e.g.,
tablets as in [12]). Furthermore, a web-services based online training and exams system was introduced
in [13]. Moreover, the development of a three-tier online exam system that is based on Struts, Spring
and Hibernate (SSH) was discussed in [14,15]. In [16], an e-Exam platform that uses the iSpring
QuizMaker software to generate an e-Exam package to be displayed in a special e-Exam browser was
proposed. Additionally, an Online Assessment and Evaluation System (OAES) that allows authoring
an exam using 15 item types was introduced in [17]. Moreover, an e-Exam system that categorizes questions
based on the course learning outcomes was presented in [18]. In [19], an adaptive online exam system
that adjusts the difficulty level of the test based on the measured student competence was introduced.
Further, an online exam system that is capable of resumption after failures and allows the selection of
random questions was explained in [20].
Several systems such as those in [21-23] used the open source Moodle Learning Management
System (LMS) [24] to offer online exams. For example, an online e-Exam platform that uses the robust
Moodle quiz module and can survive system crashes as well as network outages was discussed in [21].
Moreover, the aspects related to the transition from paper tests to Moodle based electronic exams were
explored in [22]. Accordingly, Moodle supports defining a question bank with different categories,
adding different question types (e.g., multiple choice, ordering, word select, short answer, algebra, etc.)
under each category, associating a quiz activity with a course, selecting quiz questions from the bank,
and shuffling the quiz questions for the student groups. On the other hand, it was concluded that it is
not easy to enable users to log in to Moodle using their Single Sign On (SSO) credentials (i.e., to set up
the authentication plugin) or to allow Moodle to access student enrolments in courses from the external
SIS database (i.e., configure the external database enrolment plugin). In addition, in [23] it was decided
that the standard Moodle quiz module needs to be supported by identification, session locking and
proctoring modules.
On the other hand, the research in [25-35] only addressed the issues related to the security of online
exams. A review that covered user authentication methods (e.g., knowledge, possession and biometric
based methods) and threats in online examination was presented in [25]. Moreover, the security lapses
of the online exam systems in Nigeria were identified and solutions for them were proposed in [26].
Also, a secure biometric-based online exam authentication scheme was introduced in [27]. In [28], a Secure
Exams Management System (SEMS) for mobile learning environments was discussed to tackle issues
such as: guaranteeing that the students are performing the exam in the dedicated classrooms, anti-
impersonation and preventing students from exchanging mobile devices during the exam. In addition,
the Security Control System on the Online Exam (SeCOnE) was presented in [29], in which a proctor
function is provided to remote examinees to prevent cheating. Additionally, a method was proposed in
[30] to improve the security level during an online examination by assimilating live video as well as a
physical token that is carried by the user. Furthermore, face recognition based approaches to verify the
identity of the e-Exam participants were introduced in [31,32]. In [33], a multimodal biometric
authentication framework was introduced to eliminate the threat of impersonation after the login phase.
Besides, an e-Exam protocol (named Remark!) that achieves heterogeneous security properties by
giving each principal a public/private pair of keys and a smart card that stores the principal's identity
was proposed in [34]. Finally, an Extra Secure-DES (XS-DES) method was introduced in [35] to
encrypt the exam score before saving it in the database and to encrypt the student identity before sending
it to the examiner for grading.
A comparison of key features between the EMS and its counterparts [7-20] is shown in Table 1. It is worth noting that the methods in [25-35] were excluded from that comparison as they only focused on the security aspects of online exams rather than the system as a whole, as in this paper. Accordingly, the
EMS is the only system that enables SSO authentication and thus does not require administrators to
specify a username and password for every user. In addition, it is one of few systems that tighten the
authentication with more than two factors (subsection 7.1). Moreover, it is the only system that is integrated
with other systems (i.e., SIS and AIS), which eliminates the need to enter the students, instructors,
semesters, buildings, rooms, departments, majors and financial records in the EMS in order to setup the
exams and thus makes the EMS easier to use. Furthermore, unlike all related work, the EMS supports
organizing exams into a flexible tree-based structure that allows defining different exam types
and associating them with exams in different academic semesters and sessions (subsection 3.1). In
addition, it is the only system that supports automatic scheduling of students to exam sessions
(subsection 4.2). Like some systems, the EMS supports different exam types (e.g., Arabic, English,
engineering and mathematics) and is capable of resumption after failures. On the other hand, some
systems provide many more question types than the two types that are supported in the EMS.
Nevertheless, unlike most of the related work, this work addressed the security of online exams from
different perspectives (section 7) and illustrated all the software quality attributes of the EMS via student
and instructor surveys (subsection 8.2).
Finally, although the Moodle LMS supports conducting online exams with many question types, it
cannot support some important features that the EMS provides. For example, it does not
support more than two authentication factors. In addition, it is not possible to fully integrate Moodle
with other existing systems (e.g., SIS and AIS). Furthermore, Moodle does not allow customizing the
structure of the exams, content of the reports and layout of the exam screens based on the ongoing needs
of the GJU.
Table 1. Comparison to related work

| System | SSO Auth. | Auth. Steps | Less Setup due to Integ. with Other Tools | Flexible Exam Structure Tree | Exam Types | Question Types | Automatic Scheduling of Students to Sessions | Resume Exam | Security | Software Quality Attributes |
|---|---|---|---|---|---|---|---|---|---|---|
| Web-based EMS | Yes | 4 | Yes | Yes | English, Arabic, Math., Eng. | Multiple Choice, True/False | Yes | Yes | Application, Desktops, Rooms, Servers, Network | Usability, Capability, Speed, Reliability, Availability |
| Vafeia. et al [7] | No | 2 | No | No | Eng. | Multiple Choice | No | No | N.A. | N.A. |
| Xiaoyu et al [8] | No | 2 | No | No | English | 3 Types | No | No | Application, Servers | N.A. |
| Rashad et al [9] | No | 2 | No | No | Arabic | 5 Types | No | No | N.A. | Usability |
| Tufekci et al [10] | No | 2 | No | No | N.A. | Multiple Choice | No | No | N.A. | Usability, Capability, Availability |
| Meletio. et al [11] | No | 2 | No | No | N.A. | N.A. | No | No | Servers, Network | N.A. |
| Gramoll [12] | No | 3 | No | No | Eng. | Multiple Choice | No | No | Application, Tablets | Usability |
| Pang et al [13] | No | 2 | No | No | N.A. | N.A. | No | Yes | N.A. | N.A. |
| Darong et al [14] | No | 2 | No | No | N.A. | N.A. | No | Yes | Application | N.A. |
| Sun [15] | No | 2 | No | No | N.A. | 2 Types | No | No | N.A. | N.A. |
| Al-Hakeem et al [16] | No | 2 | No | No | 28 Subjects | 11 Types | No | No | N.A. | Usability |
| Rama. et al [17] | No | 2 | No | No | N.A. | 19 Types | No | No | Application, Network | N.A. |
| Bardesi et al [18] | No | 2 | No | No | Medicine, Physics | Multiple Choice | No | No | N.A. | Usability, Availability |
| Yağci et al [19] | No | 2 | No | No | Eng. | N.A. | No | No | Application, Servers, Network | N.A. |
| Younis et al [20] | No | 3 | No | No | N.A. | N.A. | No | Yes | Application | N.A. |
3. USER VIEWS AND CAPABILITIES
The EMS supports the following three user roles: instructor, student and administrator. Instructors
are responsible for defining and administering their assigned exams, whereas a student may log in to
the system and take a designated exam at the allotted date and time. On the other hand, an administrator
may perform all the instructor’s tasks as well as grant/revoke certain roles to/from users. The tasks that
could be performed by instructors and students are summarized in the use case diagram in Figure 1.
Note that the administrator user role is not shown in that figure as it is identical to the instructor
role, with an extra use case for managing (i.e., granting and revoking) the user roles. The system views
that enable instructors and students to accomplish their required tasks are discussed in the following
subsections.
Figure 1. The use case diagram for the instructor and student views in the EMS
Figure 2: An example showing the relationships between the different objects that are used in the EMS
3.1. Instructor View
An instructor is required to perform several tasks in order to be able to define and administer the
designated exams. Such tasks are performed in a top-down approach in accordance with a tree-based
data structure that is adopted by the EMS to internally represent exams. An example of that structure
is shown in Figure 2, in which two exam types (i.e., Circuits and Arabic) are defined. Accordingly, the
Circuits exam can be held in the second 2018/2019 semester, whereas the Arabic exam can be
conducted in the first 2018/2019 and the second 2017/2018 semesters. Moreover, the Arabic exam
instance for the first 2018/2019 semester takes place in the 1-2PM (on October 1, 2018) and the 2-3PM
(on October 2, 2018) sessions, in which 40 and 60 students are scheduled respectively. Furthermore, a
form of the exam is randomly picked (from a pool of 50 forms in this example) and then assigned to
each corresponding student upon logging into either of the two aforementioned exam instance sessions.
The details of how an instructor may perform these, and other related, tasks are discussed next.
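To make the tree concrete, the Arabic branch of this example can be pictured as a plain object model. The sketch below is an illustration only; the class and field names are assumptions and do not come from the actual EMS source.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical object model mirroring the EMS exam tree in Figure 2:
// an exam type owns per-semester instances, and each instance owns
// its sessions and a pool of randomly generated forms.
class ExamTree {
    static class Session {
        final String period;
        final int capacity;
        Session(String period, int capacity) { this.period = period; this.capacity = capacity; }
    }

    static class ExamInstance {
        final String semester;
        final int formPoolSize;
        final List<Session> sessions = new ArrayList<>();
        ExamInstance(String semester, int formPoolSize) { this.semester = semester; this.formPoolSize = formPoolSize; }
    }

    static class ExamType {
        final String name;
        final List<ExamInstance> instances = new ArrayList<>();
        ExamType(String name) { this.name = name; }
    }

    // Build the Arabic branch of the example: one instance in First 2018/2019
    // with two sessions (40 and 60 seats) and a pool of 50 forms.
    static ExamType arabicExample() {
        ExamType arabic = new ExamType("Arabic");
        ExamInstance first1819 = new ExamInstance("First 2018/2019", 50);
        first1819.sessions.add(new Session("2018-10-01 13:00-14:00", 40));
        first1819.sessions.add(new Session("2018-10-02 14:00-15:00", 60));
        arabic.instances.add(first1819);
        return arabic;
    }

    // Total seating across all sessions of one exam instance.
    static int totalSeats(ExamInstance inst) {
        int seats = 0;
        for (Session s : inst.sessions) seats += s.capacity;
        return seats;
    }
}
```

In this sketch, the two sessions of the Arabic instance jointly seat the 100 scheduled students (40 + 60).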
Managing exam types (use case i1 in Figure 1). The EMS enables instructors to manage (i.e., add,
edit, activate and deactivate) exam types as shown in Figure 3. Accordingly, an exam can be currently
conducted using either the Arabic or English language, while noting that the system is capable of
supporting other languages as needed.
Managing exam sections (use case i2 in Figure 1). An exam consists of several sections corresponding
to the various topics to be covered in the exam. The user can manage the exam sections as shown in the
example in Figure 4, where the user defined six exam sections (i.e., Basic Circuit Variables and
Elements, Kirchhoff’s Laws, AC Steady State, etc.) for the Circuits exam type.
Managing an exam section questions (use case i3 in Figure 1). Each exam section can be associated
with a pool of questions via the manage exam section questions screen shown in Figure 5. Accordingly,
the instructor may add, edit and view the exam section questions as well as their corresponding answers.
Currently, the system supports multiple choice and true/false question types. Ideally, the system must
be capable of generating many different forms of the exam in order to eliminate cheating. Hence, an
exam section pool must contain many more questions than the number of questions to be
randomly picked from the pool while constructing a specific exam form. Notably, the feature-rich
CKEditor [36] is integrated within the system (see Figure 6) to edit questions and answers suitable
for all GJU disciplines (e.g., mathematics, engineering and languages). It supports many useful features
such as: formatting styles, checking spelling, embedding videos, inserting images, adding web links, as
well as defining mathematical equations using LaTeX.
Figure 4. The manage exam sections screen
Figure 6. The rich text editor used to compose questions and answers containing mathematical formulas
Figure 7. The manage exam instances screen
Managing exam instances (use case i4 in Figure 1). One of the advantages of the EMS is its ability
to associate an exam with many instances based on the academic semesters (e.g., First 2018/2019)
during which the exam sessions will be held. The basic information of each instance is: its duration
(i.e., the start and end dates during which the exam instance sessions may be scheduled), its academic
semester, its number of questions (i.e., the number of questions to be answered by students who are
taking the exam during the corresponding exam instance), its pass grade and its maximum grade as
shown in Figure 7.
Managing an exam instance's sections (use case i5 in Figure 1). The number of exam sections that
should be included in the form of a certain exam instance can be customized each semester, if needed,
by selecting the desired sections from the global exam sections of that instance as shown in Figure 8.
Furthermore, the number of questions that comprise each selected exam instance section must be
specified at this stage. Based on the user input in Figure 8, the EMS will randomly pick 10, 5, 5, 5 and
5 questions from the question pools of the first five exam sections respectively when generating any
form for the respective exam instance.
Managing an exam instance's forms (use case i6 in Figure 1). The system supports the automatic
generation of any number of forms for each exam instance using the algorithm that is discussed in
subsection 4.1. For example, the information in Figure 9 illustrates that 10 forms were generated for the
Circuits exam instance. Note that as the number of generated forms increases, the possibility that any
two students in a certain session will have the same form decreases. Hence, the chances of cheating can
be almost eliminated by making the number of forms larger than the maximum session capacity.
Consequently, one of those forms is assigned randomly to each student when they log in to take
their designated exams.
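As a rough back-of-the-envelope illustration of this claim (not part of the EMS itself): if forms are assigned uniformly at random, the chance that two particular neighbouring students receive the same form is 1/f for f generated forms, so it shrinks as more forms are generated.

```java
// Illustrative only: with forms assigned uniformly at random, the
// probability that two particular neighbouring students get the same
// form is 1/f, so more generated forms means less opportunity to copy
// answers from the student sitting next to you.
class FormCollision {
    static double sameFormProbability(int numberOfForms) {
        if (numberOfForms <= 0) throw new IllegalArgumentException("need at least one form");
        return 1.0 / numberOfForms;
    }
}
```

For instance, with the 50 forms of the earlier example, two specific neighbours share a form with probability 0.02.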
Figure 9. The screen to manage the exam instance forms
Managing an exam instance's sessions (use case i7 in Figure 1). The EMS allows instructors to manage
(i.e., add, edit, activate, deactivate and view) the exam instance sessions during which the exams will
be held via the screen in Figure 10. A session can be defined by specifying its period (i.e., its start and
end times on a specific date), assigning it to a specific room and entering its capacity. For example, the
Circuits exam instance can be held in four different sessions as illustrated in Figure 10. It is also worth
noting that each session is associated with an automatically generated 8-digit random numeric password
upon its creation. For security reasons, the session password must be kept confidential and should only
be disclosed by the instructor at the beginning of the exam in the exam room, as students will need it to
log in to that exam. Furthermore, as an extra security measure, the instructor may reset the password
of any session at any time by selecting the desired session and clicking on the Reset Password button
in Figure 10.
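An 8-digit numeric session password of this kind could be produced as sketched below. This is a hedged illustration using the standard SecureRandom class, not the EMS implementation; any leading zeros are preserved by zero-padding.

```java
import java.security.SecureRandom;

// A minimal sketch (not the EMS source) of generating an 8-digit
// numeric session password with a cryptographically strong RNG.
class SessionPassword {
    private static final SecureRandom RNG = new SecureRandom();

    // Returns a string of exactly eight digits, e.g. "04719283".
    static String generate() {
        return String.format("%08d", RNG.nextInt(100_000_000)); // range 00000000-99999999
    }
}
```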
Managing an exam instance's students (use case i8 in Figure 1). Another strength of the EMS is its
integration with the SIS [2] and AIS [3], which enables it to access any needed student information such
as: personal information, enrollment year, school, department, major, schedules, grades, exams, holds,
notes, payments and account balance. Accordingly, the instructors can easily find the students who are
eligible (e.g., are required to take the exam, have paid the due fees and are clear from any holds) to take
a certain exam. An eligible students list may then be associated with either an exam instance or an exam
instance session using the Upload Exam Students button (in Figure 11) or the Upload Session Students
button (in Figure 10), respectively. In addition, the students can be distributed automatically across the
available sessions using the algorithm that is described in subsection 4.2 upon clicking the Assign
Students to Sessions button in Figure 11. Furthermore, the transfer students feature (not shown here)
can be accessed upon clicking the Transfer Students button in order to conveniently move some of the
already scheduled students from one session to another as needed.
Computing grades (use case i9 in Figure 1). The EMS is also capable of computing the grades of all
students in all exam instance sessions (by clicking on the Compute Grades button in Figure 7) or in a
specific exam instance session (by clicking on the Compute Grades button in Figure 10), very quickly,
using the algorithm that is discussed in subsection 4.3.
Generating reports (use case i10 in Figure 1). The system supports the generation of several reports
to help instructors filter out and obtain the needed data to either summarize, verify, analyze or archive
the exam instances in addition to their sessions, students and grades. For example, the generated report
in Figure 12 shows the summary of the Arabic exam instance for the First 2017/2018 semester. It
includes information such as: the number of students who were eligible for the exam, the number of
students who took the exam, the minimum and maximum student grades, as well as the number and percentage of
the students who passed, failed and did not attend the exam. Furthermore, the report includes
information about each student who was eligible for that exam, such as: student Id, name, grade, remark
(i.e., pass, fail, absent and not started) and form number (to verify the computed student grade and
answers when required).
3.2. Student View
The student view allows students to log in to the EMS in order to take their designated exams. The
main component in this view is the easy-to-use and informative wizard (see Figure 13) that enables
students to answer and navigate between questions (i.e., use case s1 in Figure 1). An answer is saved in
the database immediately upon its selection to enable the system to recover from failures (e.g., browser
crash) with negligible data loss. The Next button is used to navigate to the next question, whereas the
Back button is used to go back to the previous question. Furthermore, the convenient My Answers page
(i.e., use case s2 in Figure 1) provides the status (i.e., answered or not) of each question in the student
exam and allows going back to answer any question (regardless of its order) at the click of a link (i.e.,
the Go Back to Question link) as shown in Figure 14.
Additionally, important information such as the remaining time and the number of questions is also
presented in the wizard. The remaining time is stored in the database every minute to allow the system
to recover from failures (e.g., closing a browser by mistake) and resume the exam for the respective
student from the last remaining time, given that the exam time is not over. Moreover, the student
information (i.e., the student's name, Id, picture, enrollment year, degree, school, department and major
as shown in Figure 15) is easily accessible (by clicking on the Student Information button) for
verification and security purposes (i.e., use case s3 in Figure 1). Besides, the exam instructions can be
viewed (i.e., use case s4 in Figure 1) at any moment during the exam by clicking on the Exam
Instructions button. Finally, the student may navigate to the finish exam page to complete the exam by
clicking on the Finish the Exam link. Consequently, a student will be denied access to an exam upon
finishing it, even if the exam time is not over.
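The resume behaviour described above can be approximated by a small timer that persists its remaining time every minute, so that a reopened exam continues from the last saved value rather than the original duration. The sketch below is hypothetical; the class and method names are assumptions that only mirror the idea.

```java
// Hedged sketch of the resume-after-failure idea: the remaining time is
// persisted periodically, so a reopened exam continues from the last
// saved value instead of restarting with the full exam duration.
class ExamTimer {
    private int remainingSeconds;

    // On (re)opening the exam, the timer is seeded with the last value
    // that was saved to the database for this student.
    ExamTimer(int savedRemainingSeconds) { this.remainingSeconds = savedRemainingSeconds; }

    // Called once per minute by the wizard; returns the value to persist.
    int tickOneMinute() {
        remainingSeconds = Math.max(0, remainingSeconds - 60);
        return remainingSeconds;
    }

    boolean isTimeOver() { return remainingSeconds == 0; }
}
```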
Figure 13. The student view wizard that enables students to take an exam
Figure 14. The My Answers screen in the student view
Figure 16. The algorithm to generate the desired number of exam instance forms
Figure 17. The algorithm to assign the exam instance students to the exam instance sessions
Figure 18. The algorithm to compute the grades for all students in all exam instance sessions
4. AUTOMATION ALGORITHMS
The algorithms used by the EMS to automatically generate any number of forms (to prevent
cheating), assign the students to the various exam instance sessions (to reduce exam setup time) and
compute the grades for many students (to relieve instructors from this cumbersome task) are discussed
in the following subsections.
4.1. Forms Generation Algorithm
The pseudo code for the forms generation algorithm is shown in Figure 16. The main loop to
generate any user specified number of forms (i.e., numberOfFormsToGenerate) starts at line 1 and ends
at line 9. At line 2, the object and database record for the nth form to be generated are initialized. Then,
the algorithm iterates over the exam instance sections (i.e., that were selected using the screen in Figure
8) in the foreach loop that starts and ends at line 3 and line 8 respectively. The questions that constitute
the exam instance section under consideration are determined within the loop block that starts at line 4
and ends at line 7. Specifically, at line 5, each question is picked randomly from the corresponding
exam section question pool, which usually contains many more questions than the number of questions
(i.e., that was specified in the screen in Figure 8) to be used in any exam instance
section. Finally, the selected question object is added to the corresponding form object at line 6.
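Assuming questions are drawn without replacement within a section (so no question repeats inside one form), the loop structure of Figure 16 can be sketched as follows. The identifier names other than numberOfFormsToGenerate are illustrative, and questions are represented by plain integer ids for brevity.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// A self-contained sketch of the forms generation loop in Figure 16:
// for each form to generate, walk the selected exam instance sections
// and draw the required number of distinct questions at random from
// each section's question pool. Not the actual EMS code.
class FormsGenerator {
    static List<List<Integer>> generateForms(int numberOfFormsToGenerate,
                                             List<List<Integer>> sectionPools,
                                             List<Integer> questionsPerSection,
                                             long seed) {
        Random rng = new Random(seed);
        List<List<Integer>> forms = new ArrayList<>();
        for (int n = 0; n < numberOfFormsToGenerate; n++) {            // main loop (lines 1-9)
            List<Integer> form = new ArrayList<>();                    // initialize nth form (line 2)
            for (int s = 0; s < sectionPools.size(); s++) {            // per selected section (lines 3-8)
                List<Integer> pool = new ArrayList<>(sectionPools.get(s));
                for (int q = 0; q < questionsPerSection.get(s); q++) { // per question (lines 4-7)
                    int pick = rng.nextInt(pool.size());               // random pick from pool (line 5)
                    form.add(pool.remove(pick));                       // add to form, no repeats (line 6)
                }
            }
            forms.add(form);
        }
        return forms;
    }
}
```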
4.2. Students to Sessions Assignment Algorithm
The current policy at GJU is to schedule the different online exam instances at non-overlapping
periods (i.e., the possibility of a student having two different online exams at the same period is
eliminated). Accordingly, the EMS uses a simple scheduling algorithm that distributes the students one
by one over the available seats in the different sessions. In case that policy changes in the future, the
EMS can utilize the three-phase ILP method for solving the timetabling problem [37] that is already
accessible from the SIS [2] to schedule the exams sessions such that no student can have more than one
exam simultaneously.
The pseudo code for the proposed students to sessions scheduling algorithm is shown in Figure 17.
Based on that, the algorithm starts by iterating over each student in an exam instance in the loop that
starts at line 1 and ends at line 9. Then, it goes over each session in the exam instance to check for
seating availability inside the foreach loop that starts and ends at line 2 and line 8 respectively. If a seat
is available according to the check at line 3, it assigns the respective student to the current session at
line 4, increments the number of students in that session by one at line 5, and exits the loop at line 6 to
proceed and schedule any remaining students.
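This first-fit behaviour can be sketched as below. Names and types are illustrative; marking an unscheduled student with -1 is an assumption of the sketch, since the figure does not state what happens when every session is full.

```java
// A compact sketch of the first-fit assignment in Figure 17: each
// student is seated in the first session that still has a free seat.
class SessionScheduler {
    static int[] assign(int numberOfStudents, int[] sessionCapacities) {
        int[] assignedSession = new int[numberOfStudents]; // student index -> session index
        int[] seated = new int[sessionCapacities.length];  // students seated per session
        for (int student = 0; student < numberOfStudents; student++) {            // lines 1-9
            assignedSession[student] = -1;                 // assumed marker: no seat found
            for (int session = 0; session < sessionCapacities.length; session++) { // lines 2-8
                if (seated[session] < sessionCapacities[session]) {               // seat check (line 3)
                    assignedSession[student] = session;    // assign student (line 4)
                    seated[session]++;                     // increment count (line 5)
                    break;                                 // next student (line 6)
                }
            }
        }
        return assignedSession;
    }
}
```

With two sessions of capacity 2 each, five students would fill both sessions in order and leave the fifth student unscheduled.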
4.3. Grades Computation Algorithm
Based on the pseudo code for the grades computation algorithm shown in Figure 18, the algorithm
starts by iterating over all the exam instance sessions (i.e., that were defined using the screen in Figure
10) in the foreach loop that starts at line 1 and ends at line 11. Then, it iterates over the students in each
exam instance session in the foreach loop that starts and ends at line 2 and line 10 respectively. The
total grade for the corresponding student (i.e., studentTotalGrade) is initialized to 0.0 at line 3. The
answers of each student are considered in the foreach block that starts at line 4 and ends at line 8. Each
answer is then checked for correctness in the conditional if-statement at line 5. In case an answer is
correct, its grade is added up to the studentTotalGrade at line 6. Finally, the computed total grade for
each student is saved at line 6.
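A minimal Java sketch of this grades computation follows; the Answer record and the in-memory map that stands in for the database save step are illustrative assumptions:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// A sketch of the grades computation algorithm described above (Figure 18).
public class GradesComputer {

    // One student answer: the grade of the question and whether it was correct.
    record Answer(double grade, boolean correct) {}

    // sessions holds the student ids per exam instance session;
    // answers maps each student id to that student's answers.
    static Map<String, Double> computeGrades(
            List<List<String>> sessions, Map<String, List<Answer>> answers) {
        Map<String, Double> savedGrades = new HashMap<>();
        for (List<String> session : sessions) {                        // line 1
            for (String student : session) {                           // line 2
                double studentTotalGrade = 0.0;                        // line 3
                for (Answer a : answers.getOrDefault(student, List.of())) { // line 4
                    if (a.correct()) {                                 // line 5
                        studentTotalGrade += a.grade();                // line 6
                    }
                }
                savedGrades.put(student, studentTotalGrade);           // save total
            }
        }
        return savedGrades;
    }

    public static void main(String[] args) {
        Map<String, List<Answer>> answers = Map.of(
                "s1", List.of(new Answer(2.0, true), new Answer(3.0, false)),
                "s2", List.of(new Answer(2.0, true), new Answer(3.0, true)));
        System.out.println(computeGrades(List.of(List.of("s1", "s2")), answers));
        // s1 earns only the correct answer's grade (2.0); s2 earns 5.0
    }
}
```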
5. DATABASE DESIGN
The Entity-Relationship (ER) diagram in Figure 19 represents the main database tables used in the
EMS. Accordingly, the EXAM_TYPE table contains the definitions of the different exam types. Each
exam type can reference a language from the LANGUAGE table. Moreover, each exam type is
associated with exam instances and exam sections in the EXAM_INSTANCE and EXAM_SECTION
tables respectively. In addition, each exam section is associated with a pool of questions stored in the
EXAM_SECTION_QUESTION table. Also, the answers of each question are stored in the
EXAM_SECTION_ANSWER table. Furthermore, each exam instance is associated with sessions from
the EXAM_INSTANCE_SESSION table. Besides, each exam instance section in the
EXAM_INSTANCE_SECTION table is linked to an exam instance as well as a global exam section from
the EXAM_SECTION table.
Additionally, students can be registered in an exam instance by inserting their own records in the
STUDENT table and then associating each record with the corresponding exam instance entry.
Likewise, a student can be seated in an exam instance session by linking the relevant student entry with
the desired exam instance session record. Besides, a student record gets associated with a random exam
instance form in the EXAM_INSTANCE_FORM table when the corresponding student logs into the
specific exam for the first time. Each form is, in turn, linked to a group of questions in the
EXAM_INSTANCE_FORM_QUESTION table. Finally, whenever a student answers a question
during a specific exam instance session, a corresponding record gets inserted in the
STUDENT_EXAM_INSTANCE_ANSWER table.
Figure 19. The ER diagram for the main database tables in the EMS
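To illustrate the last step, a hypothetical DAO sketch shows how an answer record could be inserted into the STUDENT_EXAM_INSTANCE_ANSWER table via JDBC; the column names are assumptions, not the actual EMS schema:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// A hypothetical DAO sketch for recording a student's answer in the
// STUDENT_EXAM_INSTANCE_ANSWER table, as described above.
public class StudentAnswerDao {

    static final String INSERT_SQL =
            "INSERT INTO STUDENT_EXAM_INSTANCE_ANSWER "
            + "(STUDENT_ID, EXAM_INSTANCE_SESSION_ID, QUESTION_ID, ANSWER_ID) "
            + "VALUES (?, ?, ?, ?)";

    // Inserts one answer record; a real DAO would obtain the connection
    // from a pool managed by the application server.
    static void saveAnswer(Connection con, long studentId, long sessionId,
                           long questionId, long answerId) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(INSERT_SQL)) {
            ps.setLong(1, studentId);
            ps.setLong(2, sessionId);
            ps.setLong(3, questionId);
            ps.setLong(4, answerId);
            ps.executeUpdate();
        }
    }

    public static void main(String[] args) {
        // No database is reachable here; just show the parameterized statement.
        System.out.println(INSERT_SQL);
    }
}
```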
Figure 20. The adopted software project management and development processes
6.2. Three-Tier Application Architecture
The EMS is a web-based application that is implemented using the Java EE functionalities [40]
according to a three-tier (i.e., client, web and data tiers) architecture as shown in Figure 21. The client
tier consists of the web browsers that the clients (e.g., students and instructors) use to access the EMS
features via a web-based user interface (i.e., HTML pages containing UI components) from their
laptops, desktops or mobile devices. The web tier includes the EMS application that is deployed to a
Java EE application server running within a data center in a different location from the clients. The data
tier contains the database management system (DBMS) that manages the EMS related databases and is
hosted on a dedicated machine.
The client-side web browser uses the secure HTTPS protocol to request a desired application page
from the server-side Java EE application server. In turn, the application server passes the request to the
EMS application for processing. Then, the web container generates the needed HTML page and sends
it back to the client for display. In addition, while generating a requested page, the EMS software
components might interact with the DBMS to query or update any respective data.
The EMS application consists of JavaServer Faces (JSF) pages [41], managed beans, and Java
classes (e.g., models and utilities). A JSF page may contain HTML, JSF and PrimeFaces [42] elements.
A managed bean is a special Java class that is managed (i.e., instantiated, updated and destroyed) by
the JSF framework, is used to save the state of each relevant JSF page, and contains methods that
implement the application logic. The managed bean methods use Data Access Objects (DAOs) that in
turn use the Java Database Connectivity (JDBC) API [43] to query or update any corresponding data in
the related database tables. Furthermore, the application may utilize any of the services that are readily
available in the JSF framework such as data conversion and validation, as well as internationalization
and error handling.
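The managed-bean-to-DAO layering can be sketched in plain Java as follows; the class, the stand-in DAO interface and the method names are illustrative, and the JSF annotations that the real bean would carry are only mentioned in comments so the sketch stays self-contained:

```java
import java.util.List;

// A plain-Java sketch of the managed-bean pattern described above. In the
// real EMS the class would carry JSF annotations (e.g., @ManagedBean and
// @ViewScoped) and the DAO would use JDBC; both are replaced here by
// illustrative stand-ins.
public class ExamPageBean {

    // Stand-in for a JDBC-backed Data Access Object.
    interface QuestionDao {
        List<String> loadQuestions(long examInstanceId);
    }

    private final QuestionDao dao;
    private List<String> questions;   // page state saved by the bean
    private int currentIndex = 0;

    ExamPageBean(QuestionDao dao) { this.dao = dao; }

    // Application logic invoked when the JSF page is first requested.
    void load(long examInstanceId) {
        questions = dao.loadQuestions(examInstanceId);
    }

    String currentQuestion() { return questions.get(currentIndex); }

    // Action method bound to, e.g., a PrimeFaces "next" button.
    void next() {
        if (currentIndex < questions.size() - 1) currentIndex++;
    }

    public static void main(String[] args) {
        ExamPageBean bean = new ExamPageBean(id -> List.of("Q1", "Q2"));
        bean.load(42L);     // the exam instance id is a made-up value
        bean.next();
        System.out.println(bean.currentQuestion()); // prints Q2
    }
}
```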
Figure 22. The flow chart for the student multifactor authentication and authorization scheme
Figure 23. The firewall and network configuration in which the EMS is deployed
7. SECURITY SCHEMES
A multifactor procedure for authentication and authorization, as well as several security schemes
for the exam rooms, servers and network, is adopted to protect the EMS application and infrastructure
from unauthorized access, as explained in the next subsections.
7.1. Multifactor Authentication and Authorization
The flow chart for the strong multifactor authentication and authorization scheme that the EMS
adopts to secure student logins to online exams is shown in Figure 22. The method is based
on four major steps and gains its strength from relying on information stored in different databases (i.e.,
the Microsoft Active Directory (AD) [44], SIS, AIS and EMS databases). Accordingly, the four
steps are as follows:
Authentication: Students are authenticated based on their Single Sign-On (SSO) credentials (i.e.,
username and password) that they also use to login to other GJU applications (e.g., MyGJU portal, E-
Mail, E-Learning). The Microsoft AD is utilized to store and verify the login credentials of the GJU
students and staff members. Hence, the EMS uses the Java Naming and Directory Interface (JNDI) [45]
to communicate with the AD via the Lightweight Directory Access Protocol (LDAP) [46]. Furthermore,
as an extra security step, a student is given five attempts to enter the correct credentials;
otherwise, login fails and the relevant account gets blocked for several minutes. In addition, a strong
password policy is enforced on all students to prevent their accounts from being compromised.
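The JNDI-over-LDAP bind described above can be sketched as follows; the directory URL and the user-principal format are illustrative assumptions about the deployment:

```java
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingException;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;

// A sketch of SSO authentication against Active Directory over LDAP using
// JNDI, as described above. The host and the user-principal suffix are
// hypothetical, not the actual GJU configuration.
public class AdAuthenticator {

    // Builds the JNDI environment for a simple LDAP bind with the
    // student's SSO credentials.
    static Hashtable<String, String> ldapEnv(String username, String password) {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://ad.example.edu:389"); // hypothetical host
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        env.put(Context.SECURITY_PRINCIPAL, username + "@example.edu"); // UPN form
        env.put(Context.SECURITY_CREDENTIALS, password);
        return env;
    }

    // Returns true when the AD accepts the credentials (i.e., the bind succeeds).
    static boolean authenticate(String username, String password) {
        try {
            DirContext ctx = new InitialDirContext(ldapEnv(username, password));
            ctx.close();
            return true;
        } catch (NamingException e) {
            return false;   // wrong credentials or unreachable server
        }
    }

    public static void main(String[] args) {
        // No directory server is reachable here, so only show the environment.
        System.out.println(ldapEnv("student1", "secret").get(Context.SECURITY_PRINCIPAL));
    }
}
```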
Authorization: Upon successful authentication, the system verifies whether a student is eligible to take
an exam. First, it verifies that the student has an active account in the SIS database. Moreover,
it checks whether the student is scheduled for an exam that overlaps with the login moment based on the
information in the EMS database. In addition, it makes sure that the student has paid the relevant exam
fees according to the AIS database. A student is denied access to the system in case any
of the previous checks fails.
Security Question Validation: As an extra security measure, a student is given a maximum of five
attempts to answer a random security question related to the student's information (e.g., secondary
e-mail, birthdate, etc.) in the SIS database.
Session Password Validation: In this last step, the student needs to correctly enter the provided exam
instance session password to successfully complete the login process. An exam instance session
password is automatically generated by the EMS (see Figure 10) and is only disclosed by the instructor
at the beginning of the corresponding session in the relevant exam room. Hence, if students try to attend
an exam in a room other than the one designated to their exam session, their login fails because the
password disclosed in that room will not match theirs. Accordingly, this check forces students
to attend their exams only in the assigned rooms. Upon successful login, the EMS first randomly assigns
a form of the exam to the respective student. Then, it navigates to the exam start page in the student
view.
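The four steps above can be condensed into the following Java sketch, in which each predicate stands in for one of the database-backed checks:

```java
import java.util.function.Predicate;

// A condensed sketch of the four-step login flow described above (Figure 22).
public class LoginFlow {

    enum Result { OK, BAD_CREDENTIALS, NOT_AUTHORIZED,
                  BAD_SECURITY_ANSWER, BAD_SESSION_PASSWORD }

    // Each predicate stands in for one database-backed check.
    static Result login(String student,
                        Predicate<String> adAuthenticated,      // step 1: AD bind
                        Predicate<String> authorized,           // step 2: SIS/EMS/AIS checks
                        Predicate<String> securityAnswerOk,     // step 3: SIS security question
                        Predicate<String> sessionPasswordOk) {  // step 4: session password
        if (!adAuthenticated.test(student))   return Result.BAD_CREDENTIALS;
        if (!authorized.test(student))        return Result.NOT_AUTHORIZED;
        if (!securityAnswerOk.test(student))  return Result.BAD_SECURITY_ANSWER;
        if (!sessionPasswordOk.test(student)) return Result.BAD_SESSION_PASSWORD;
        return Result.OK; // EMS then assigns a random exam form and opens the wizard
    }

    public static void main(String[] args) {
        Result r = login("student1", s -> true, s -> true, s -> true, s -> false);
        System.out.println(r); // fails at the last step: BAD_SESSION_PASSWORD
    }
}
```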
7.2. Exam Room Security
For security reasons, the online exams are only conducted in dedicated computer laboratories (i.e.,
exam rooms equipped with desktops). Furthermore, several security procedures are also followed in the
exam rooms to prevent cheating. For instance, student impersonation is difficult as an exam proctor can
conveniently verify a student's identity by viewing the student picture and information in the student
information screen (shown in Figure 15), which is accessible at the click of a button (i.e., the Student
Information button) from the student exam wizard (shown in Figure 13). Also, the possibility of students
cheating in an exam is very low as the EMS can assign a different form of the exam to each student.
Furthermore, mobile devices are prohibited and Internet access is blocked in the computer laboratory.
Moreover, no student may leave the laboratory until the end of the grace period that is given
to students who report late to the exam room.
7.3. Servers Security
The following security measures are taken to protect the EMS and database servers from any
unauthorized access: strong passwords are enforced for server and application administrators, the
security-centric Linux operating system (OS) is used to operate all servers, all unnecessary services are
turned off on all servers, all used software (i.e., OS and applications) are patched regularly to close any
vulnerabilities, and anti-virus software is executed and updated periodically on all servers.
7.4. Network Security
The network security policy rules are configured on the firewall (see Figure 23) in order to: block
access from the Internet to the EMS and database servers that reside within the GJU local area network
(i.e., GJU Intranet), allow database and system administrators to access the EMS and database servers
only from within the GJU Intranet, and permit access to the EMS views (i.e., student and instructor
views) only from the GJU Intranet via the secure HTTPS protocol service port. Accordingly, neither
students nor instructors may access the EMS from outside the GJU Intranet. However, in very few cases
and only when needed, some instructors can be granted access to the EMS from the Internet (i.e., from
the outside of the GJU Intranet) via a secure VPN connection.
8. VALIDATION AND RESULTS
The EMS deployment results as well as the user surveys that were conducted to evaluate the
software quality attributes of the student and instructor EMS views are discussed in the next subsections.
8.1. Deployment Results
The EMS was used to set up and conduct four instances of the Arabic proficiency exam in the
following semesters: first 2016/2017 (with a total of 740 students), first 2017/2018 (with a total of 769
students), second 2017/2018 (with a total of 218 students) and first 2018/2019 (with a total of 767
students) as summarized in Figure 24. Furthermore, it will be used to conduct mathematics and
engineering online exams starting from the second 2018/2019 semester. For example, the first
2018/2019 exam instance was distributed over 22 sessions as shown in Figure 25. Accordingly, the
sessions were held on September 16 (6 sessions), 17 (6 sessions), 18 (6 sessions), 24 (2 sessions), 25 (1
session) and 26 (1 session), 2018. Moreover, the maximum number of simultaneous sessions per day
was 2 sessions assigned in rooms H501 and H503. Also, the maximum number of students per session
was 63 on September 26, whereas the maximum number of students per day was 212 on September 17.
Figure 24. The number of students who took the Arabic language proficiency online exam in four different instances
Figure 25. The Arabic proficiency online exam instance sessions for the first 2018/2019 semester
8.2. Student and Instructor Surveys Results
Two user surveys related to the student and instructor EMS views were conducted at the beginning
of the first 2018/2019 semester. The answers to each question in both surveys were based on the
five-point Likert scale [47]. Accordingly, each question has the following five answers: Strongly Agree (5
points), Agree (4 points), Neutral (3 points), Disagree (2 points) and Strongly Disagree (1 point). The
student survey questions shown in Table 2 focused on the following software quality attributes: usability
(questions 1 and 2), reliability (question 3), performance (question 4) and cutting edge (question
5). On the other hand, the instructor survey questions shown in Table 3 evaluated the following software
quality attributes: usability (question 1), functionality (question 2), reliability (question 3), performance
(question 4) and availability (question 5). Furthermore, the number of selections of each answer, the
total answers, and the average score for every question are reported in Table 2
and Table 3 according to the responses of the students and instructors respectively.
Table 2. The students survey results (per question: answer counts 5/4/3/2/1, total answers, average score, standard deviation, confidence interval)
1. The exam system is easy to use: 447/253/40/8/19, 767, 4.435, 0.84, 0.059
2. The system messages as well as the My Answers and Exam Instructions screens were informative and useful: 403/260/72/13/19, 767, 4.323, 0.89, 0.063
3. I did not encounter any technical issues with the exam system: 479/214/33/23/18, 767, 4.451, 0.89, 0.063
4. The exam system allows answering questions and navigating between screens smoothly and quickly: 474/237/35/8/13, 767, 4.500, 0.78, 0.055
5. The exam system is considered advanced compared to similar applications that I have used before: 344/259/123/25/16, 767, 4.160, 0.95, 0.067
All questions: 2147/1223/303/77/85, 3835, 4.374, 0.88, 0.062
Table 3. The instructors survey results (per question: answer counts 5/4/3/2/1, total answers, average score, standard deviation, confidence interval)
1. The EMS features are easy to use: 3/1/1/0/0, 5, 4.400, 0.80, 0.057
2. The EMS supports the needed features to manage online exams: 2/2/0/1/0, 5, 4.000, 1.10, 0.077
3. I encountered very few technical issues with the EMS: 3/1/1/0/0, 5, 4.400, 0.80, 0.057
4. The EMS is responsive and fast: 3/2/0/0/0, 5, 4.600, 0.45, 0.034
5. The EMS is accessible any time: 5/0/0/0/0, 5, 5.000, 0.00, 0.000
All questions: 16/6/2/1/0, 25, 4.480, 0.81, 0.057
Based on that, 767 students and 5 instructors participated in the respective surveys. Furthermore,
most students strongly agreed or agreed that the exam system: is easy to use (1363 out of 1534 related
responses i.e., 89% of the responses), is reliable (693 out of 767 related responses i.e., 90% of the
responses), is fast (711 out of 767 related responses i.e., 93% of the responses), and is cutting edge
compared to similar systems that they have used before (603 out of 767 related responses i.e., 79% of
the responses). On the other hand, the majority of instructors strongly agreed or agreed that the EMS:
is easy to use (4 out of 5 related responses i.e., 80% of the responses), supports the needed functionality
(4 out of 5 related responses i.e., 80% of the responses), is reliable (4 out of 5 related responses i.e.,
80% of the responses), is fast (5 out of 5 related responses i.e., 100% of the responses), and is highly
available (5 out of 5 related responses i.e., 100% of the responses). In both tables, the standard deviation
values illustrate that, even one standard deviation below the average score, the score remains above three
for all questions (except question 2 in Table 3, whose results are less accurate due to the small sample
size), which is still acceptable. In addition, the confidence interval values were negligible for all
questions, which supports the accuracy of the results.
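The reported statistics appear consistent with a weighted average, a population standard deviation, and a 95% z-interval half-width (1.96·σ/√n) computed from the raw answer counts; the following sketch (class and method names are illustrative) reproduces the question 1 values in Table 2 to within rounding:

```java
// A sketch reproducing the Table 2 statistics from the raw answer counts.
public class LikertStats {

    // counts[i] is the number of respondents who chose the score 5 - i, i.e.,
    // counts = {Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree}.
    static double mean(int[] counts) {
        int total = 0, sum = 0;
        for (int i = 0; i < 5; i++) { total += counts[i]; sum += (5 - i) * counts[i]; }
        return (double) sum / total;
    }

    // Population standard deviation of the weighted scores.
    static double stdDev(int[] counts) {
        double m = mean(counts);
        int total = 0;
        double ss = 0.0;
        for (int i = 0; i < 5; i++) {
            total += counts[i];
            ss += counts[i] * Math.pow((5 - i) - m, 2);
        }
        return Math.sqrt(ss / total);
    }

    // 95% z-interval half-width: 1.96 * sd / sqrt(n).
    static double confidenceInterval(int[] counts) {
        int total = 0;
        for (int c : counts) total += c;
        return 1.96 * stdDev(counts) / Math.sqrt(total);
    }

    public static void main(String[] args) {
        int[] q1 = {447, 253, 40, 8, 19};   // question 1 in Table 2
        System.out.printf("%.3f %.2f %.4f%n",
                mean(q1), stdDev(q1), confidenceInterval(q1));
        // close to the reported 4.435, 0.84 and 0.059
    }
}
```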
9. SUMMARY AND CONCLUSIONS
This paper discussed several aspects related to the analysis, design, development, integration,
deployment and security of a web-based examinations management system that has been developed in-
house at the GJU. Accordingly, the systems engineering basic profile in the ISO/IEC 29110 series was
adopted in the EMS project management and software development processes. In addition, the EMS
was implemented according to the Java EE three-tier architecture. Furthermore, the system is feature
rich and state-of-the art as it supports capabilities such as:
- Reduced exams setup, as a lot of the needed setup information (e.g., buildings, rooms, students,
etc.) is automatically available for use in the EMS due to its integration with the SIS and AIS.
- The definition of a flexible exam structure that is comprised of exam types, exam sections, a
questions pool per exam section, exam instances for each exam type and sessions for every exam
instance.
- The composition of an exam in either the Arabic or English language.
- A rich text editor that facilitates composing beautifully formatted multiple choice questions and
answers that may also contain mathematical equations, web links, images and videos.
- Generating many forms of an exam.
- Automatically assigning students to exam instance sessions.
- Transferring some of the already scheduled students from one session to another when needed.
- Computing grades at the click of a button.
- Producing several types of reports in different formats.
- An easy to use and informative exam wizard that enables students to answer and navigate between
questions, display student information, view exam instructions and resume exams in case of
failures (e.g., browser crash and short Internet outage).
The EMS functionality besides its integration with different systems and databases enabled the
implementation of the following security schemes:
- User authentication based on SSO credentials that are stored in the Microsoft AD database.
- Student authorization according to account, scheduling and financial information accessible from
different databases (i.e., the SIS, EMS and AIS databases respectively).
- Forcing students to attend an exam only in the assigned rooms by having instructors disclose each
automatically generated session password at the beginning of the corresponding session in the
designated exam room.
- The capability to view the student picture and information in the student exam wizard at any time
to prevent student impersonation.
- The possibility of assigning each student a different form of the exam to reduce the chances of
cheating.
Furthermore, the deployment results illustrated that the EMS has been successfully used at GJU to
organize economical, paperless online exams in several semesters over the past three academic years.
Accordingly, the instructors were relieved from the time-consuming and cumbersome process of
manually grading paper-based exams. Moreover, the conducted user surveys demonstrated that the
majority of students and instructors believe that the EMS views are user friendly, cutting edge, capable,
reliable, fast and highly available.
As for future work, biometric authentication (i.e., fingerprint or face recognition) methods in
addition to more question types (e.g., short answer, essay and missing words) need to be considered to
improve the system's security and capability respectively.
REFERENCES
1. F. Al-Hawari. MyGJU student view and its online and preventive registration flow. International Journal of Applied
Engineering Research, 2017, 12(1):119-133.
2. F. Al‐Hawari, A. Alufeishat, M. Alshawabkeh, H. Barham, and M. Habahbeh. The software engineering of a three‐
tier web‐based student information system (MyGJU). Computer Applications in Engineering Education, 2017 Mar,
25(2):242-263.
3. F. Al-Hawari. Analysis and design of an accounting information system. International Research Journal of
Electronics and Computer Engineering, 2017 Jun, 3(2):16-21.
4. M. Kuikka, M. Kitola, and M.J. Laakso. Challenges when introducing electronic exam. Research in Learning
Technology, 2014 Aug, 22:1-17.
5. G. Rabbany, and M.O. Rashid. Online learning and exam system. Senior Project Report, Daffodil International
University, Dhaka, Bangladesh, 2017.
6. A. Fluck, H. Pálsson, M. Coleman, M. Hillier, D. Schneider, G. Frankl, and K. Uolia. eExam symposium: design
decisions and implementation experience. In IFIP World Conference on Computers in Education, 2017, pp 3-6.
7. C. Vafeiadou, P. Vasiloudis, and M. Dasygenis. Online automatic examination system for digital circuits. In IEEE
5th International Conference on Modern Circuits and Systems Technologies (MOCAST), 2016, pp 1-4.
8. D. Xiaoyu, and L. Yunhao. Research and implementation of web-online English testing system. In IEEE
Information Science and Engineering (ISISE), 2010 Dec, pp 430-433.
9. M.Z. Rashad, M.S. Kandil, A.E. Hassan, and M.A. Zaher. An Arabic web-based exam management system.
International Journal of Electrical & Computer Sciences IJECS-IJENS, 2010 Feb, 10(1):48-55.
10. A. Tufekci, H. Ekinci, and U. Kose. Development of an internet-based exam system for mobile environments and
evaluation of its usability. Mevlana International Journal of Education. 2013 Sep, 3(4):57-74.
11. G. Meletiou, I. Voyiatzis, V. Stavroulaki, and C. Sgouropoulou. Design and implementation of an E-exam system
based on the Android platform. In IEEE 16th Panhellenic Conference on Informatics, 2012 Oct, pp 375-380.
12. K.C. Gramoll. Development and implementation of a tablet-based exam app for engineering course. In Proc. ASEE
Annual Conf., Seattle, WA, 2015 Jun, pp 1-12.
13. H. Pang, S. Yang, and L. Bian. A web services based online training and exam system. In IEEE 4th International
Conference on Wireless Communications, Networking and Mobile Computing, 2008. WiCOM'08, 2008 Oct, pp
1-4.
14. H. Darong, and H. Huimin. Realization and research of online exam system based on S2SH framework. In IEEE
International Conference on Web Information Systems and Mining (WISM), 2010 Oct, pp 396-399.
15. Y. Sun. Design of online examination system based on SSH framework. In 2015 International Conference on
Advances in Mechanical Engineering and Industrial Informatics. 2015 Apr, pp 366-369.
16. M.S. Al-Hakeem and M.S. Abdulrahman. Developing a new e-Exam platform to enhance the university academic
examinations: the case of Lebanese French University. International Journal of Modern Education and Computer
Science, 2017 May, 9(5):9-16.
17. C. Ramanathan, S. Banerjee, and N.J. Rao. OAES: Scalable and secure architecture for online assessment and
evaluation system. In IEEE 4th International Conference on MOOCs, Innovation and Technology in Education
(MITE), 2016, pp 296-301.
18. H. Bardesi and M. Abdel Razek. Learning outcome E-Exam system. In Sixth International Conference on
Computational Intelligence, Communication Systems and Networks, 2014 May, pp 77-82.
19. M. Yağci and M. Ünal. Designing and implementing an adaptive online examination system. Procedia Social and
Behavioral Sciences. 2014 Feb, 116:3079-3083.
20. M.I. Younis and M.S. Hussein. Construction of an online examination system with resumption and randomization
capabilities. International Journal of Computing Academic Research (IJCAR), 2015 Apr, 4(2):62-82.
21. M. Hillier, S. Grant, and M. Coleman. Towards authentic e-Exams at scale: robust networked Moodle. Working
paper, Monash University, 2018.
22. D. Draskovic, M. Misic, and Z. Stanisavljevic. Transition from traditional to LMS supported examining: A case
study in computer engineering. Computer Applications in Engineering Education. 2016 Apr, 24(5):775-786.
23. R.M. Borromeo. Online exam for distance educators using Moodle. In IEEE 63rd Annual Conference International
Council for Educational Media (ICEM), 2013 Oct, pp 1-4.
24. Moodle. Available from https://moodle.org/ [last accessed December, 2018].
25. N.A. Karim and Z. Shukur. Review of user authentication methods in online examination. Asian Journal of
Information Technology, 2015, 14(5):166-175.
26. O. Adebayo, and S.M. Abdulhamid. E-exams system for Nigerian universities with emphasis on security and result
integrity. International Journal of the Computer, the Internet and Management (IJCIM), 2014 Feb, 2(18):1-12.
27. T. Ramu, and T. Arivoli. A framework of secure biometric based online exam authentication: an alternative to
traditional exam. Int J Sci Eng Res. 2013 Nov, 4(11):52-60.
28. M. Kaiiali, A. Ozkaya, H. Altun, M.H. Haddad, and M. Alier. Designing a secure exam management system
(SEMS) for M-learning environments, 2016, 9(3):258-271.
29. I.Y. Jung, and H.Y. Yeom. Enhanced security for online exams using group cryptography. IEEE transactions on
education, 2009 Aug, 52(3):340-9.
30. M. Mathapati, T.S. Kumaran, A.K. Kumar, and S.V. Kumar. Secure online examination by using graphical own
image password scheme. In IEEE International Conference on Smart Technologies and Management for
Computing, Communication, Controls, Energy and Materials (ICSTM), 2017 Aug, pp 160-164.
31. S. Idemudia, M.F. Rohani, M. Siraj, and S.H. Othman. A smart approach of E-Exam assessment method using face
recognition to address identity theft and cheating. International Journal of Computer Science and Information
Security. 2016 Oct, 14(10):515-522.
32. A. Fayyoumi and A. Zarrad. Novel solution based on face recognition to address identity theft and cheating in
online examination systems. Advances in Internet of Things, 2014, 4(2):5-12.
33. I. Traoré, Y. Nakkabi, S. Saad, B. Sayed, J.D. Ardigo, and P.M. de Faria Quinan. Ensuring online exam integrity
through continuous biometric authentication. In Information Security Practices, Springer International Publishing,
Switzerland. 2017, pp 73-81.
34. R. Giustolisi, G. Lenzini, and P.Y. Ryan. Remark!: A secure protocol for remote exams. In Cambridge International
Workshop on Security Protocols, 2014 Mar, pp 38-48.
35. O. Zughoul, H.M. Jani, A. Shuib, and O. Almasri. Privacy and security in online examination systems. International
Organization of Scientific Research-Journal of Computer Engineering (IOSRJCE), 2013 Mar, 10(4):63-70.
36. CKEditor 4. Available from https://ckeditor.com/ckeditor-4/ [last accessed December, 2018].
37. F. Al‐Hawari, M. Al‐Ashi, F. Abawi, and S. Alouneh. A practical three‐phase ILP approach for solving the
examination timetabling problem. International Transactions in Operational Research, in press, 2017.
38. C.Y. Laporte, R.V. O’Connor, and L.H. Paucar. The implementation of ISO/IEC 29110 software engineering
standards and guides in very small entities. In International Conference on Evaluation of Novel Approaches to
Software Engineering. Springer International Publishing. 2015 Apr, pp 162-179.
39. W. Royce. Managing the development of large software systems. Proceedings of IEEE WESCON. 1970 Aug,
26(8):328-338.
40. J. Juneau. Introducing Java EE 7: A look at what's new. Apress, USA, 2013.
41. J. Juneau. JavaServer Faces: Introduction by example. Apress, USA, 2014.
42. M. Caliskan, and O. Varaksin. PrimeFaces cookbook. Packt Publishing, United Kingdom, 2015.
43. S. White. JDBC API tutorial and reference: universal data access for the Java 2 platform. Addison-Wesley
Longman Publishing Co., Inc., USA, 1999.
44. D. Iseminger. Active directory services for Microsoft windows 2000. Microsoft Press, USA, 1999.
45. R. Lee and S. Seligman. The JNDI API tutorial and reference: building directory-enabled Java applications.
Addison-Wesley Longman Publishing Co., Inc., USA, 2000.
46. K. Zeilenga. Lightweight directory access protocol (LDAP): Technical specification road map. RFC 4510, 2006.
47. R. Likert. A technique for the measurement of attitudes. Archives of psychology. 1932, 22(140):1-55.