

Academic Project Monitoring Using a Web-based System

Cameron C. Gray, Dave Perkins, Ludmila I. Kuncheva, Member, IEEE

Abstract—Undergraduate and postgraduate students take part in project work designed with broad scopes. These projects encourage students to showcase their abilities along with their time and project management skills. Timescales for such projects are necessarily longer, meaning that keen supervision can be required to ensure effort is appropriately applied. We present a web-based pedagogical tool and workflow to support effective monitoring of these students within the UK and Western European Higher Education systems. This workflow includes an oversight capability to handle the objections and unavoidable subjectivity that can arise. Ultimately, the process and tool are designed to help students achieve their potential, improving success rates when measured against the learning outcomes. The authors believe that the tool and workflow are abstracted enough to be used in any project-based review process.

Index Terms—Student Progress, Monitoring Tool, Student Support

1 INTRODUCTION

SUPPORTING longer-term student projects can present new challenges to educators. The student is expected to maintain interest and effort over an extended period in addition to their usual study loads. The onus, quite rightly, is placed upon the student to manage their time and resources appropriately. Previous work [1] has already shown the multitude of benefits of attentive monitoring, ensuring every student reaches their potential. The rationale for such a process could be reduced to common sense: the earlier an instructor is aware of issues, the sooner they can intervene.

Each academic has developed a unique style and strategy for maintaining a watchful eye over their student projects. These practices are often separate from (often centralized) pastoral care systems. The process is often opaque and accountable to no-one [2], [3]. Only when a deliverable, be it the final product or a milestone, is submitted is any outside opinion solicited. With long-running projects (such as postgraduate research degrees, or final year dissertations), this can lead to wasted effort and upset on both sides of the supervisory relationship. With undergraduate projects, the scope for going awry is less, but significant disagreements can still occur. (Most disagreement with undergraduate projects is between supervisor and second marker, rather than student and supervisor.)

This paper presents the following contributions:

1) An examination of the existing process and solutions in use in the UK and Western Europe.
2) A rationale and benefits of altering this process.
3) A web-based tool to support the improved monitoring process.
4) Two small-scale case studies, using both undergraduate and postgraduate students at Bangor University.
5) An evaluation showing efficacy for research student projects and wider applications where regular and repeated monitoring is required.

(C. Gray, D. Perkins and L. I. Kuncheva are with the School of Computer Science, Bangor University, UK. E-mail: {c.gray,d.perkins,l.i.kuncheva}@bangor.ac.uk. Manuscript received June 19, 2016.)

2 RELATED WORK

Most student monitoring systems are designed to measure the students' progress holistically within the scope of any given course. Some of these are web-based [4], [5], others traditional desktop applications [6], [7]. There are examples of systems developed to solve particular educational issues, such as teaching languages to allow access to more educational options [8]. Significant effort has already been spent on creating visualizations to track students' progress throughout their careers [9], [10], [11], [12], [13]. Almost all of these, however, are based on existing metrics and data available within an institution. They are primarily designed to find or emphasize patterns that are not immediately apparent to educators.

When dealing with student project work, there is a significant body of research stating that careful monitoring is required [14], [15], [16]. However, there are fewer mechanisms proposed to deal specifically with project work. There have been suggestions for on- and off-line systems [17], [18]; however, these usually have an ulterior motive, such as removing undesirable behavior traits or improving the assessment of group efforts.

Drummond and Boldyreff proposed an entirely virtual environment [19] to contain the project work. This approach allows monitoring and gathering of statistics on almost every interaction with the system. This concept certainly has merit but is not applicable in every situation. In the suggested deployment of their tool (programming), almost all work can be measured by the interactions and the result.
However, in research and when teaching abstract concepts, the important detail is lost as the virtual environment cannot capture it. For example, the number of papers, websites or other electronic resources used could easily be recorded. It is not possible to track offline resources, nor whether the student read or understood the material(s).

Research into similar systems, dubbed Intelligent Tutoring Systems, was in vogue during the late 1990s to mid-2000s. These systems attempted to emulate the response and adaptive qualities of live teachers [20], [21], [22]. While this form of system may well be successful in courses with narrowly defined curricula, the range of responses required to handle project work is simply too vast to model in one of these systems. Another drawback to a fully automated/electronic system is that students can learn to game the system [23]. A corollary is where students apply a strategic learning style [24] to pass a module/test/examination. Without the 'human in the loop', this gaming can go unchecked. In a project context, the result is weakened oversight, and arguably more aspects that could go awry despite the application of technology and appropriately reported progress.

3 THE WESTERN EUROPEAN RESEARCH MONITORING PROCESS

This section most readily applies to Ph.D. and M.Sc.(Res) programs. There will be slight variations by institution and subject. Elements can be transposed into undergraduate or taught M.Sc. theses.

Institutions in the UK and Western Europe share a similar annual, or semi-annual, review process for monitoring progress and setting goals. We base this assertion on personal communications between the authors and academics from University of Cagliari, Italy; Delft University of Technology, Netherlands; and Technical University of Sofia, Bulgaria. The United Kingdom perspective¹ comes from the authors' first-hand knowledge of the Bangor University process.

¹ The substantive portions of the UK view are also corroborated by various policy documents, such as http://www.ncl.ac.uk/fms/postgrad/documentation/documents/Resstudenthandbook2014-15.pdf §4, http://www.abdn.ac.uk/cops/graduate/assessment-process-255.php, and http://www.gcu.ac.uk/graduateschool/postgraduatestudy/phdstudyatgcu/researchstudentprogressionforms/

Each period, either semi-annually or annually as defined by local policy, the student is required to prepare a report detailing activities in that period. The prompts provided, on forms where used, are largely the same: 'what have you achieved?', 'what progress have you made?', 'what challenges have you faced?', and 'what is the plan for the next period?'. These questions suggest a reflective approach, more than a coldly analytic one. Once the report has been prepared, the student will meet formally with their supervisory committee. The student's immediate supervisor(s), any co-supervisors, and any other advisers assisting with the endeavor have a seat on this committee. The student may be asked to present their report orally, or it may take the form of a Q&A session.

After this meeting, the supervisory committee will prepare a written report detailing the student's performance and the committee's recommendation (progress, fail, or downgrade to a lower qualification). After being agreed and signed by the members of the committee and the student, the report is submitted to the institution. The tone and tenor of this return could be colored (either way) by the relationship [2]. Most versions of the forms asked for an impression of progress but did not require evidence to be submitted to justify those perceptions. Therefore, no proof of a truly objective assessment of achievement or skill can be shown. As a result, the report could be prepared in a one-sided manner. If or when disagreements occur, the student may feel they have little option than to avail themselves of the formal appeals process, further fuelling a deterioration of the relationship.

4 BENEFITS OF REVISING THE MONITORING PROCESS

Any progress monitoring will necessarily include an element of evaluation, of the student and of deliverables. It is natural in these situations that students feel some level of performance anxiety [25], [26]. Using an environment that is familiar to the student can help lessen that anxiety. Higher Education institutions routinely use Virtual Learning Environments (VLEs) such as Moodle, Blackboard Learn™ and similar. Utilizing these systems for assessment is not a new concept [27], [28], [29]. Rarely, though, are such systems used for control or assessment of research and/or project activities. Such systems are used as a repository for marks and feedback to the student, as well as for plagiarism detection. As more digital natives embark on their higher education careers, the use of technology will be far more accepted than ever before [30]. This familiarity may help lessen the anxiety associated with the progress review when students are compiling their responses.

In these longer-term projects, academics tend toward using a report such as a dissertation or a thesis as the primary deliverable and use this for assessment. The thinking behind this choice is to incorporate transferable and communication skills into the evaluation of student performance. This practice is reasonable and filled with good intentions. It does mean, however, that the assessment becomes entirely summative. The workflow and tool presented here do not argue to remove this form, but to supplement it with formal, regular and transparent monitoring meetings.

Any monitoring tool/process need not solely report students' progress; it can be another channel for feedback to benefit both sides of the relationship. A previous study spanning three years found that students do value their tutors' feedback but will only spend a limited time on it [31]. In a project setting this translates into requiring more, but smaller, 'chunks' of feedback to digest at regular intervals. A methodology using this suggestion can help support the existing supervisory process with plans and guidance at every stage. Research commissioned by the Higher Education Academy [32] highlights the necessity of feedback to nurture and encourage a partnership approach to release a student's potential. Chao [16] has found that electronic tools can be particularly effective in this regard.

An unintended, but highly desirable, side-effect of requiring regular progress reports is that students become more reflective about their experiences. Previous investigations [33], [34] corroborate this assertion. As a consequence, students are more aware of their progress and accomplishments, as well as where issues reside. As the student is already aware of potential failings, this can lessen the impact of some negative feedback. However, the accepted deviations regarding self-assessment must be taken into account [35]. The critical item from the cited list is that stronger students are harsher on themselves, whereas weaker students typically overrate their work.

Within Bangor University, previous solutions have been unable to increase transparency. While the student concerned had input, the report from their supervisor was the final word. A meta-analysis [36] of 119 individual studies has found that the teacher-student relationship is one of the highest-impact elements that a teacher can control in driving student achievement. To be transparent, a method that allows two-way constructive criticism is required. Such a system is necessary for ensuring continued viability of longer-term projects.

5 THE PROPOSED TOOL AND WORKFLOW

The Project Progress Monitoring System is designed to be a non-intrusive method bringing transparency and outside corroboration to projects at any level. Figure 1 shows, graphically, the proposed workflow; we explain each step in greater detail in this section.

[Fig. 1. Flowchart showing the proposed monitoring process: the student completes the review form and confidential survey, the supervisor completes a review form, the chair reviews the submissions and decides whether a meeting is required; if not, the chair finalises the decision, the student and supervisor agree the outcome/actions (comment stage, with an optional override), and the chair signs off the review. See Section 5 for full descriptions of each stage. The 'meeting required' decision is made in concert with the participants of the review rather than by any one party.]

The most obvious deviation from existing processes is that we recommend adding a chairperson. This participant is expected to be objective and ideally impartial. The chairperson acts as a moderator between the student and their supervisor. They hold the casting vote on whether the student has demonstrated sufficient progress or not. Any candidate for chairperson would be part of the same field of study, but not directly involved with the work. In smaller departments/schools/groups, or those with a close-knit community, suitable candidates may be difficult to find. In these cases, we would expect the head of the community to make an executive decision in appointing a chair.

To make the process as straightforward as possible, we have implemented the system using familiar web-based concepts. The staff and students should already be familiar with these constructs as they are becoming more ubiquitous every day. Using these constructs means there is not another set of controls, concepts, and applications to learn. This familiarity should cause less of the anxiety common with new systems, helping the process bed in rather than users pushing against it. However, there will always be segments of the student body that find using a web environment difficult. This effect may be particularly acute for mature students or those from lower socio-economic backgrounds who are not exposed to technology as much.

5.1 Process Description

5.1.1 Review Creation and Form Submission
Originally the authors envisaged a member of administrative or support staff performing the initial creation of the review for the appropriate students. However, there is no reason that each student's primary supervisor could not carry out this task instead. The creation step simply involves selecting the program, phase of study, the form submission deadline, and a provisional date for the review meeting (assuming it is required). There are no restrictions on when reviews may be scheduled. Most institutions' regulations stipulate that a review must occur at least once a year. By accommodating more than one Review during any phase of study, supervisors are given a level of flexibility to schedule reviews according to need, events, and ability.

Once created/scheduled, the student is asked to complete two forms, each with a different purpose. The first form provides the student the opportunity to catalog their achievements, progress made, challenges faced, and goals for the next period (however long that may be). The student may (optionally) upload any supporting evidence they deem suitable. The second form, named the 'Confidential Survey', offers a quantitative and qualitative view into the student experience. It is intended
to be a 'safe space' for the student to make observations and, if necessary, raise concerns about their situation. For the school/department/college/institution, these responses provide invaluable feedback when examined in an aggregate context, for example, at a cohort, program or school level. In some regards, 'Confidential Survey' is a misnomer. It is only intended to be confidential from the primary supervisor, so that the student has a mechanism to report critically on the working relationship. The survey is not meant to be confidential from the institution or chairperson. The authors would expect anyone appointed as a chairperson to abide by professional ethics in not revealing the contents of the survey unless there is anything of extreme concern.

The supervisor is asked to complete a different form, with prompts tailored to the evaluation of the student's performance. Evaluations use a four-point rubric: met/exceeded expectations, some improvement needed, major improvement required, and unacceptable. These rubrics are intended to be generic; each school/department is then left to produce a form of words describing what each element means within the realm of their programs. Alongside each rating prompt, the supervisor must provide a rationale for the selection. Without this provision, the chairperson would be presented with a verdict but no evidence. We expect that students would use the upload feature to balance out any counter-claims. Neither the student nor the supervisor is permitted to access the initial submissions of the other, as the returns are intended to be independent.
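To make the shape of this stage concrete, here is a minimal Python sketch of a review and its creation step. It is an illustration only, not the authors' implementation; the names (Review, create_review) and every field are assumptions drawn from the description above.

    from dataclasses import dataclass, field
    from datetime import date
    from typing import Optional

    @dataclass
    class Review:
        """Hypothetical model of a single Review (names are illustrative)."""
        program: str                          # e.g. "Ph.D." (see the Programs list in Section 6)
        phase_of_study: str                   # e.g. "Year 2"
        submission_deadline: date             # deadline for the initial forms
        provisional_meeting: Optional[date]   # only used if a formal meeting is required
        student_form: dict = field(default_factory=dict)         # achievements, progress, plans
        confidential_survey: dict = field(default_factory=dict)  # hidden from the supervisor
        supervisor_form: dict = field(default_factory=dict)      # rubric ratings plus rationales

    def create_review(program: str, phase: str, deadline: date,
                      meeting: Optional[date] = None) -> Review:
        """The creation step: select program, phase, deadline and a provisional date."""
        return Review(program, phase, deadline, meeting)

    # Example: a supervisor schedules an annual Ph.D. review.
    review = create_review("Ph.D.", "Year 2", date(2016, 6, 30), date(2016, 7, 14))

Keeping the three forms as separate fields mirrors the independence requirement: each return can be filled in and stored without reference to the others.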

5.1.2 Chairperson Review/Preliminary Decision
The chairperson receives a notification to review the initial returns when they are complete. The chairperson is asked to complete an identical form to the one the primary supervisor had. As the impartial observer, their ranking of performance is drawn from the evidence provided by both the supervisor and the student. The chairperson is also required to provide the rationale for their selections. It is anticipated that the chairperson will make reference to all three submissions in their rationales.

Assuming the best-case scenario, where both the student and supervisor broadly agree on progress and there are no significant concerns raised in the Confidential Survey, the chairperson can form a preliminary decision and suggest to the other participants that a formal meeting is not required. If accepted by all, the preliminary determination stands, the chairperson enters the student's plans as the Review Action Plan, and the process moves on to the approval/sign-off stage.

In the event of disagreement, doubts, mismatches, or concerns regarding the supervisory relationship, chairpersons are actively encouraged to hold the Formal Review Meeting to explore these issues fully. There is no prescribed format for these meetings, as this can vary with the type of problems identified. As with all aspects of assessment, the best practice is to form objective views of progress and performance. Ultimately, this would be resolved using the chairperson's professional judgment based on the submitted evidence and in-person representations from all concerned.
5.1.3 Action Plan/Comments
The output of any review is an Action Plan and a formal decision on progression. The Action Plan, as its name suggests, is the formally agreed next steps. The suggested workflow provides for four categories of person that can have actions assigned: student, supervisor, school and other. These four have been selected in recognition of the fact that, despite participants' best efforts, other external factors may result in a poor outcome in a review or of the project. Examples of actions on a school could revolve around improved access to resources, library materials, or clarification on process, regulations or assessment criteria. The 'other' actor is included for situations where there may be institution-wide changes or clarification required, where projects include an industrial, professional or clinical partner, or any other third party needing to complete actions to ensure the ongoing success of the project.

Rarely, in assessment matters, are things completely black and white. Research and other longer-running project work present a particular challenge, as multiple skills are judged at once. To balance the power assessors wield in these situations, any of the participants have the ability to make comments on the process. The system stores these remarks alongside the submissions and decisions, allowing objections and corrections to be noted. The authors envisage situations where the participants disagree on the scope, wording, or presence of items on the action plan. It remains within the chairperson's option to adjust the Action Plan or formal decision based on this new information.
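The Action Plan structure lends itself to a short sketch. The four actor categories below come directly from the text; the surrounding class and field names are hypothetical.

    from dataclasses import dataclass, field
    from enum import Enum
    from typing import List

    class Actor(Enum):
        """The four categories of person that can have actions assigned."""
        STUDENT = "student"
        SUPERVISOR = "supervisor"
        SCHOOL = "school"
        OTHER = "other"   # e.g. an industrial, professional or clinical partner

    @dataclass
    class ActionItem:
        assigned_to: Actor
        description: str

    @dataclass
    class ActionPlan:
        items: List[ActionItem] = field(default_factory=list)
        comments: List[str] = field(default_factory=list)  # objections/corrections from any participant

    # Example: a school-level action, plus a comment disputing its wording.
    plan = ActionPlan()
    plan.items.append(ActionItem(Actor.SCHOOL, "Clarify the assessment criteria for Year 2."))
    plan.comments.append("Student: please name the specific regulations to be clarified.")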
5.1.4 Agreement and Sign-Off
As with almost any formal procedure, the review process ends with an agreement stage. Usually with paper-based systems, and more than a few electronic ones, this agreement takes the form of a signature. With a web-based solution, physical signatures are not a viable option. A standard digital solution is to replace the physical signature with some form of encryption that provides a 'digital signature'. However, these technologies require more knowledge and cyber-security skills than an average lay-person possesses.

An alternative approach is to use a tried and tested mechanism that all modern computer users are familiar with: their username and password. By making the agreement a 'secured' positive action, the participant is forced to log into the system to perform the agreement action to signify their assent. Any user's credentials are supposed to remain secret, meaning that we can reasonably assume the person making the confirmation is who they say they are. (There are wider implications for a student and the institution if any user's credentials are misappropriated.) Asking an identified/authenticated user to perform an action that is not automatically completed has been held to provide a suitable analog to a signature [37]. While approval is still pending, all participants are still permitted to comment. The chair is authorized to alter the Action Plan in response. If they choose to do so, all previous approvals are reset. This protection is to prevent a situation where someone who had previously agreed no longer agrees, due to the changes.
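The following sketch illustrates the 'login as signature' idea under stated assumptions: credentials would really be verified against the institution's directory (Section 6.3), and the storage here is an in-memory stand-in. All names are hypothetical.

    import hashlib
    import hmac
    from datetime import datetime, timezone
    from typing import Dict, List

    # In-memory stand-ins for a review's state (illustrative only).
    action_plan: List[str] = ["Student to submit draft literature review by May."]
    approvals: Dict[str, datetime] = {}

    # Placeholder credential store; a real deployment would bind against LDAP instead.
    _users = {"jsmith": hashlib.sha256(b"correct horse").hexdigest()}

    def verify_credentials(username: str, password: str) -> bool:
        digest = hashlib.sha256(password.encode()).hexdigest()
        return hmac.compare_digest(_users.get(username, ""), digest)

    def sign_off(username: str, password: str) -> bool:
        """Record assent only after a fresh, positive authentication by the user."""
        if not verify_credentials(username, password):
            return False
        approvals[username] = datetime.now(timezone.utc)
        return True

    def amend_action_plan(new_items: List[str]) -> None:
        """A change by the chair resets all previous approvals, as described above."""
        action_plan[:] = new_items
        approvals.clear()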
Assuming that all participants agree to the decision and action plan as is, they would signify their approval and the system notifies the chairperson that the review is ready for sign-off. Once a review has been signed off, no further changes or comments are permitted. However, if one participant is proving to be particularly intransigent, the chairperson is authorized to force the issue, overriding the objections. This ability is required to defuse a situation where a student refuses to give approval, most likely because it would result in a poor outcome, in order to prolong the process. If this occurs, a comment is added to the review attesting to the override, the review is signed off, and all participants are notified.
6 TOOL DESIGN

Most, if not all, interfaces with other systems and technologies have been configured to use Bangor University's IT infrastructure. However, in most cases it would be trivial to adjust for another institution, assuming standards compliance.

The largest consideration in the design of a project monitoring system is the management of the sheer quantity of differences among the various projects and levels. The simplest solution to this issue is to make the entire system data-driven, rather than to hard-code any assumed choices. As with all large development projects, the primary task is to scope, define, and investigate the problem domain. In this case, the main entities in the system are: Student, Supervisor, Chair, and Review. Chair, student, and supervisor are all forms of a user and only vary in the role held within any given Review.
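A sketch of that entity model follows. It shows only the role-within-a-Review idea and is not the authors' schema; every name here is illustrative.

    from dataclasses import dataclass
    from enum import Enum

    class Role(Enum):
        STUDENT = "student"
        SUPERVISOR = "supervisor"
        CHAIR = "chair"

    @dataclass(frozen=True)
    class User:
        username: str   # one account, regardless of role

    @dataclass
    class Participation:
        """Binds a User to a Review under a given Role; the same person may be
        a SUPERVISOR in one Review and a CHAIR in another."""
        user: User
        role: Role
        review_id: int

    # Example: Dr. X supervises review 12 but chairs review 31.
    dr_x = User("drx")
    roles = [Participation(dr_x, Role.SUPERVISOR, 12), Participation(dr_x, Role.CHAIR, 31)]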
There are only two concretely defined entities within the system: Programs and Progress Rubrics. Programs represent types of activities to be monitored, in this case, degree types. The full list of available Programs is:

• B.Sc.
• B.A.
• M.Phil.
• M.Sc.(Res)
• M.Phil. [Part Time]
• M.Sc.(Res) [Part Time]
• Ph.D.
• Ph.D. [Part Time]
• Research (Generic or non-student)

There are four Progress Rubrics:

• Achieved/Exceeded Expectations
• Targeted Improvement Required
• Significant Improvement Required
• Unacceptable

Individual schools/departments are able to customize the exact wording of each prompt.

Initially, the tool was implemented using features offered by Bangor's chosen VLE, Blackboard™. However, the task was ultimately beyond the capabilities of this tool. Within Blackboard, assignments and quizzes (types of student submissions) only have two participants: instructor(s) and the student/learner.
6.1 Form Versions
Every Review will contain the three forms, but these must vary by School and over time. The concept of Form Versions is used to accommodate this variance; a simplified Entity Relationship Diagram illustrates this in Figure 2.

[Fig. 2. Entity Relationship Diagram showing Form Version relations among the School, Form Version, Question, and Review entities.]

Using the versions concept, the system can recreate the form as it was in use at the time, irrespective of how much time has passed or how many subsequent versions there are. Administrators may mark versions as 'available for copy', allowing others to base their form on this shared or reference copy. The shared versions are then made available to all departments/schools to base their own form versions on. Similarly, each version can be marked as 'active'. A version can only be used in a Review after being made active. The tool's algorithm selects the most recent active version (highest version number) for the selected school when creating Reviews.
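That selection rule is small enough to sketch directly. The dictionary field names below are assumptions for illustration, not the tool's actual schema.

    from typing import Dict, List

    def select_form_version(versions: List[Dict], school: str) -> Dict:
        """Pick the active form version with the highest version number for a school."""
        candidates = [v for v in versions if v["school"] == school and v["active"]]
        if not candidates:
            raise LookupError(f"No active form version for school {school!r}")
        return max(candidates, key=lambda v: v["version"])

    versions = [
        {"school": "Computer Science", "version": 1, "active": True},
        {"school": "Computer Science", "version": 2, "active": True},
        {"school": "Computer Science", "version": 3, "active": False},  # drafted, not yet active
    ]
    # Version 2 is chosen: the highest-numbered version that is actually active.
    assert select_form_version(versions, "Computer Science")["version"] == 2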
6.2 Question Types
To allow users to define their questions, the tool provides some basic question types. The type frames the mode of response the participant can make. Each type has associated user interface (UI) files and handling routines built into the tool. These routines handle the associated data operations, controlling the display and storage of submitted responses. The initial set of question types provided in the first release of the tool is:

• 5-Point Likert Scale
• File Upload
• Free-form Text
• Progress Rubric
• Yes/No

Due to the logic associated with each question type, the tool will need to undergo further development to accommodate changes to the types. This set was based on types used in Bangor University's previous paper system. Section 7 presents the case studies made using the tool. The second case study (with undergraduates) did not require any changes to the question types.
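The five types map naturally onto a dispatch table, which also suggests why adding a type means new code rather than new data. This is a hedged sketch; the widget descriptions are illustrative placeholders, not the tool's actual UI.

    from enum import Enum

    class QuestionType(Enum):
        LIKERT_5 = "5-point Likert scale"
        FILE_UPLOAD = "file upload"
        FREE_TEXT = "free-form text"
        PROGRESS_RUBRIC = "progress rubric"
        YES_NO = "yes/no"

    # Each type needs its own UI widget and handling routine, so adding a type
    # means extending this mapping (and the storage code behind it).
    _WIDGETS = {
        QuestionType.LIKERT_5: "radio buttons 1-5",
        QuestionType.FILE_UPLOAD: "file picker",
        QuestionType.FREE_TEXT: "text area",
        QuestionType.PROGRESS_RUBRIC: "four-point rubric selector",
        QuestionType.YES_NO: "yes/no toggle",
    }

    def render(qtype: QuestionType, prompt: str) -> str:
        return f"{prompt} [{_WIDGETS[qtype]}]"

    print(render(QuestionType.YES_NO, "Did the website show any errors?"))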
6.3 Authentication and Role Identification
A crucial part of any IT system is security: identifying and authenticating users within a system, and granting appropriate access. The tool could be configured to use its own database of user credentials. However, university staff and students already have many other credentials for separate systems in addition to their institutional ones. The tool has therefore been configured to use the Lightweight Directory Access Protocol (LDAP). This protocol integrates the tool with the
institution's Active Directory database so that users may authenticate with their usual institution credentials.

The tool offers auto-complete when entering student, staff or chair details. This is achieved using the same LDAP connection and the organizational groups configured within the Active Directory. All staff members belong to various defined groups, such as the exported group of staff members from the campus management software "Banner". Similar groups exist to identify students based on their course, cohort, supervisor and degree program. This detail allows the tool to restrict searches for a student's name, surname, or username to those that are needed, based on the course/program under review.
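The sketch below shows both LDAP uses together, written against the third-party Python ldap3 package. The server address, base DN, and group names are assumptions; a real deployment would substitute the institution's Active Directory values.

    from ldap3 import ALL, SUBTREE, Connection, Server

    # Assumed deployment values (hypothetical host and base DN).
    SERVER = Server("ldaps://ad.example.ac.uk", get_info=ALL)
    BASE_DN = "dc=example,dc=ac,dc=uk"

    def authenticate(username: str, password: str) -> bool:
        """An LDAP bind with the user's own credentials doubles as authentication."""
        try:
            conn = Connection(SERVER, user=f"{username}@example.ac.uk",
                              password=password, auto_bind=True)
            conn.unbind()
            return True
        except Exception:
            return False

    def autocomplete_names(conn: Connection, prefix: str, group_dn: str) -> list:
        """Auto-complete restricted to an AD group, e.g. a cohort or program group."""
        conn.search(BASE_DN,
                    f"(&(objectClass=person)(memberOf={group_dn})(cn={prefix}*))",
                    search_scope=SUBTREE,
                    attributes=["cn", "sAMAccountName"])
        return [str(entry.cn) for entry in conn.entries]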
6.4 Form Visibility
In the original design, all participants were able to see the details about the review: the deadlines, status and stage, action plan, comments, etc. They were also able to review only the forms that they had completed themselves. This restriction means the student was not able to see their supervisor's return. The chair, meanwhile, could view anything. Some study participants felt that, while they were able to see the final decision, they should be entitled to see the final report submitted by the chair to the institution. The second version of the tool included this suggestion, but only once the review was finalized. We did not make the remaining forms any more open, keeping the independent nature of the submissions to avoid bias or tainted results.
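Those visibility rules fit in a few lines. This sketch reflects the second version of the tool as described above; the role and form labels are illustrative, not the tool's identifiers.

    def can_view(viewer_role: str, owner_role: str, form: str, finalized: bool) -> bool:
        """Visibility rules: the chair sees everything; everyone sees the shared
        review details; the chair's final report opens up after sign-off; initial
        submissions remain visible only to their author."""
        if viewer_role == "chair":
            return True
        if form == "review_details":          # deadlines, status, action plan, comments
            return True
        if form == "final_report":            # opened to all participants once finalized
            return finalized
        return viewer_role == owner_role      # independent initial submissions

    # The student cannot see the supervisor's return...
    assert not can_view("student", "supervisor", "initial_form", finalized=False)
    # ...but can read the chair's final report after sign-off.
    assert can_view("student", "chair", "final_report", finalized=True)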
6.5 Security and Limited Access
As this tool is a prototype and has a deliberately limited user group, the decision was taken to make it available only within the Bangor University campus. As such, no party (including students with a review pending) could access it from a general internet connection. The primary motivation was security. Every endeavor was made to ensure the tool did not expose confidential information. However, a full security review was not conducted on the software. There may be, however unlikely, situations where the application could leak data or allow unauthenticated users access to sensitive systems. During the studies, the authors were aware that this decision might make it more inconvenient for participants to complete the process. As such, the evaluation survey was changed to try to assess this impact quantitatively.

7 CASE STUDIES

Two studies were carried out: the first using a postgraduate setup, the second using forms modified to be useful to undergraduate students. Both studies were carried out using staff and students in the School of Computer Science at Bangor University between 2015 and 2016. The number of participants in these studies is too small to draw generalized conclusions. However, they do show positive results.

7.1 Form Configuration
In both studies, the configuration of the forms was set to match an updated internal paper review process. This update more closely aligned the system with the Postgraduate Research Experience Survey (PRES) developed by the Higher Education Academy². The student review form contains the usual four questions prompting for details of their achievements, progress, challenges, and plans. The Confidential Survey asks the following, using a five-point Likert scale:

• 'I know whom to speak to/contact if I have a problem or concern about my studies.'
• 'Staff make an effort to understand any difficulties I face.'
• 'I understand the standard of work that is expected of me.'
• 'I have adequate access to the necessary research equipment.'
• 'I have adequate access to the library facilities necessary for my research.'
• 'I have suitable working space at the School.'
• 'Feedback from my supervisor(s) helps me plan my research/work.'
• 'I have the technical support I need.'
• 'I am encouraged to think about my career development needs.'
• 'I feel integrated into the research community of the School.'
• 'There are opportunities for social contact with other research students in the College and University.'
• 'I understand the requirements of the monitoring process.'

² Full details of the survey, methodology and results can be found at https://www.heacademy.ac.uk/research/surveys/postgraduate-research-experience-survey-pres

Lastly, the student is offered an opportunity to raise any other concerns they wish to. The prompt reminds students that, to generate a positive outcome, they must detail the issue exactly, along with a desired course of action. There is no method to measure these matters or their severity quantitatively. The chairperson must decide if, and how much, weight each is afforded within the process.

The supervisor and chairperson are asked to rate the students on the following aspects of their development and capabilities displayed during the review period:

• Project Management Skills
• Knowledge and Understanding
• Theoretical/Numerical Analysis Skills
• Experimental Skills
• Written Presentation Skills
• Oral Presentation Skills
• Overall Progress

There are four rubrics available for each rating. From best to worst: "Met or exceeded all specified requirements.", "Substantively met requirements, targeted improvement needed in some areas.", "Failed to meet requirements in some areas. Significant improvement is required." and "Failed to meet requirements for the course.".

For each rubric prompt, there is a corresponding free-form prompt for the rationale. The staff form also includes a five-point Likert scale question to comment on the effectiveness of the supervisory relationship.
7.2 Study Sample 1: Ph.D. Progression
Our first study implemented the prototype tool for the review of Computer Science Ph.D. students at the end of A.Y. 2014/15. This study involved ten full-time students, four home (UK) and six international, at varying stages of the three-year program. The school formally reviews these students' progress at the end of each year, with individual supervisors monitoring day-to-day efforts. As such, each student only completed the review process once. The output from the tool was used directly to generate the return submitted to the doctoral school, complete with decisions.

Two of the eight reviews flagged sufficient differences to require the formal meeting of the chair, supervisor, and student. In the remaining six cases, the chair determined that the students had made sufficient progress. All bar one student progressed to the next phase of their studies; the other student agreed a repeat of their study period due to a change of topic and supervisor for reasons beyond the student's control. They were all asked a battery of questions after their experience to evaluate the process; only 8 of the students provided complete responses. See Appendix A for the full data set.

All students understood the process that they were being asked to undertake, and most (5/8) utilized the new features offered to them. All but two (6/8) liked the modified review process. All eight students reported that they liked using a web-based system, and that they would not have rather used a paper-based one. This result goes some way to proving the utility of the system and that it is an improvement over previous attempts. As discussed in the design section, students were unable to access the prototype from external internet connections. Three students highlighted that being unable to use the system from their home was a limiting factor. Open access to a final, thoroughly tested and secured system is not expected to be an issue.

The students approached the task in a matter-of-fact style, simply reporting successes and challenges with little comment or explanation. The plans they presented were similarly objective. This response may represent the standard supervisory dynamic. If the student felt they had an issue requiring intervention, it would have been brought up with their supervisor at the time.

The largest benefit to the School of Computer Science is in the students' responses to the Confidential Survey. These highlighted to School Management four 'quick wins' to improve the student experience. These were related to journal access, perceived IT issues, and social integration. The School was able to provide more information and make changes within the department to directly address the reported concerns almost immediately. However, the issue of social integration with a student body that has undergone a drastic shift in ethnic balance over recent years will require longer-term consideration and action. The previous system/process did not include any way to capture this information. Any comments would have been made directly to supervisors and may not have been communicated effectively to higher levels of management.

7.3 Study Sample 2: Undergraduate Dissertations
A sample of three undergraduates participated in the trial of applying the process to another level of study. The students selected were a convenience sample; however, the group were all following different degree programs, providing necessary variation. These students were reviewed slightly less formally, because the review outcome did not decide whether they were able to continue in their project/program. Instead of the review coming at the end of a year or project, the students were reviewed every three weeks. This time-frame was chosen to provide continuous tracking of progress data without adding too much undue burden on the students. Each student completed a total of three reviews.

In all nine cases, the supervisor and chairperson both agreed to forgo the formal meeting with the student. None of the reviews highlighted any significant differences of opinion on progress or achievement. When reviewing the returns for the entire study, both the supervisor and student identified the same areas of success and improvement. The students show an unexpected level of self-awareness and critical analysis. Two (of the three) were able to identify causes for slow progress accurately. However, the last student was either less aware of the causes, or less inclined to detail them on the form.

The undergraduate participants were asked to complete a similar survey on their experiences; all three did so. The survey questions were altered to avoid confusion, as routine monitoring is not usually a part of their final year project in this form. Appendix B lays out the responses to this survey, including the changed question texts.

Most likely as a result of the unfamiliar process of continuous monitoring, there was a reluctance to use previous review returns to inform the next. The three show an effective indifference to the usage of reviews in encouraging progress, or to the reporting of progress being informative. There was a 2:1 split among the students against the process, not liking it overall and not finding the Confidential Survey worthwhile.

However, the response was reversed (now being favorable) when asked about the tool, its web-based context and ease of use. This observation supports the assertion that current students are more at ease with web-based systems. The numbers of contributions, achievements, and challenges noted by each student in each review are small. This effect is explained by the shortened review period in this experiment. Responses detailing the amount of time the form required show a similar effect.

In this sample, there was no stand-out benefit to the School. However, as there were no causes for concern voiced in either type of student form, we can conclude the process is fit for purpose at the undergraduate level. The study had a dual mission. As well as testing the tool/process, it was also designed as a verification of supervisory skills, as the academic concerned was mentoring dissertation students for the first time. The result that no student felt there were issues with the supervisory relationship, and that the chair felt both sets of submissions were in keeping with each other, indicates that the supervision was effective.
7.4 Discussion
There is, in both samples, a widespread acceptance of the process, with only one participant (of the 11 completing the survey) indifferent to the change. No student expressed the view that they would prefer to use a paper-based system, despite just over half (6 of 11) not using the extra features that a web-based system offers them, namely uploading and attaching evidence.

Both surveys share an interesting finding: the authors assumed that students would utilize previous data to inform their current responses. Only two students reported doing this. This observation may be a side-effect of the change to the on-line system. Ready access, through the system, to the previous data may well change student practice in the future.

As can be expected, there is a distinct difference between postgraduate and undergraduate students' responses on the utility of the review process itself. Postgraduate students accept and understand the necessity of regular monitoring as part of their studies. By contrast, undergraduates appear to view the process as a nuisance, as it is not core to the completion of their dissertation projects. If the process were to be extended to the entire cohort, we expect this perception to improve, as there will be less difference between individual student experiences. Completing reviews would become a standard feature rather than the exception.

So far, the student experience has been the focus, as they are the primary user. However, the utility and experience of supervisors and chairs must not be discounted. The collection method for their experiences was informal, asking for honest opinions and how the replacement has affected their practice with the students.

"I found the new system to be a joy to use compared to the existing paper-based approach. Having everything managed automatically also meant that more time could be spent concentrating on providing better feedback to students rather than spending more time on administration." (Dr. William J. Teahan, Director of Postgraduate Studies, School of Computer Science - Supervisor and Chair)

"The progress of every research student is formally reviewed on an annual basis up until the successful submission of the thesis. The system was developed for tracking individual students' academic progress, so that school and mentors are aware when a student is struggling academically at early stages of their studies. The system was easy to use by both the supervisor and the student. In addition, a user manual is provided describing the various form options. The form summarizes the student's research progress and highlights at early stages any supervisory issues." (Dr. Sa'ad P. Mansoor, Head of School, School of Computer Science - Supervisor and Chair)

This system does not only measure the student's performance over time; it can also be used to measure the effectiveness of Continuous Professional Development (CPD) for supervising staff members over time. The addition of the chairperson, and their objectivity, means that they are ideally placed to make a judgment on how well the other participants' views correlate with each other. This may also be used as part of the training provided to new supervisors when handling their first project/research student.

After further review, Bangor University has decided to roll out the system to all postgraduate programs without modification. Individual schools will still be able to customize forms as they see fit, but the basic process/workflow will remain intact. The authors take this as final confirmation that the system is sound. Students will begin utilizing the institution-wide version from A.Y. 2016/17.

8 CONCLUSION

We have presented a viable process and tool for handling on-going monitoring of academic projects. The prototype has been successfully deployed in two case studies, looking at both undergraduate and postgraduate work. The use of a familiar, web-based environment has proven to be popular with the students involved in our case studies. Supervisors and administrators have similarly approved, citing a reduction in manual processing, leaving more time to focus on their students.

The Form Versions concept allows the tool to keep pace with institutional needs over time without re-development. The tool earmarks each review with the version of the form used, enabling administrators to recover historical entries/reviews with ease. Similarly, the versions allow department or project-group level variance. This flexibility alleviates the burden, common to such institution-wide efforts, of finding a one-size-fits-all form. The flexibility also permits the process to be applied to other periodic monitoring tasks, assuming there are the same three parties and the same types of forms in use.

We set out to replace an under-performing process mired in paperwork with an efficient one supported by a web-based tool. We have shown, in two instances, that our proposed solution has increased student satisfaction and reduced administrative burden, all the while increasing utility and transparency. The system is being made available to all Schools at Bangor University. Service departments are also inquiring about adapting the tool to their internal review processes. We have therefore concluded that the project has been a success and delivered upon its stated aims.

ACKNOWLEDGMENTS

The authors would like to thank the undergraduate and postgraduate students at Bangor University for participating in the trial of this tool, and for providing invaluable feedback on the process. The authors also thank the contacts at other institutions for providing invaluable access to their processes and regulations.
REFERENCES

[1] N. Safer and S. Fleischman, "Research matters / How student progress monitoring improves instruction," Educ. Leadership, vol. 62, no. 5, pp. 81–83, 2005.
[2] P. C. Taylor and V. Dawson, Critical Reflections on a Problematic Student–Supervisor Relationship. Lawrence Erlbaum, 1998, pp. 105–127.
[3] A. Yeatman et al., "Making supervision relationships accountable: graduate student logs," Australian Universities' Review, vol. 38, no. 2, p. 9, 1995.
[4] R. Mazza and V. Dimitrova, "CourseVis: A graphical student monitoring tool for supporting instructors in web-based distance courses," Int. J. Human-Computer Studies, vol. 65, no. 2, pp. 125–139, 2007.
[5] R. Mazza and C. Milani, "GISMO: a graphical interactive student monitoring tool for course management systems," in Int. Conf. Technology Enhanced Learning, 2004, pp. 1–8.
[6] S. L. Deno, A. L. Reschly, E. S. Lembke, D. Magnusson, S. A. Callender, H. Windram, and N. Stachel, "Developing a school-wide progress-monitoring system," Psychology in the Schools, vol. 46, no. 1, pp. 44–55, 2009.
[7] K. F. Vlug, "Because every pupil counts: the success of the pupil monitoring system in the Netherlands," Educ. and Inf. Technologies, vol. 2, no. 4, pp. 287–306, 1997.
[8] H. Reinders, "Big brother is helping you: Supporting self-access language learning with a student monitoring system," System, vol. 35, no. 1, pp. 93–111, 2007.
[9] R. Mazza and V. Dimitrova, "Visualising student tracking data to support instructors in web-based distance education," in Proc. 13th Int. Conf. World Wide Web, ACM, 2004, pp. 154–161.
[10] J. C. Roberts, C. Headleand, D. Perkins, and P. D. Ritsos, "Personal visualization for learning," in Personal Visualization Workshop, IEEE Conf. Visualization, 2015.
[11] M. May, S. George, and P. Prévot, "TrAVis to enhance online tutoring and learning activities: Real-time visualization of students tracking data," Interactive Technology and Smart Educ., vol. 8, no. 1, pp. 52–69, 2011.
[12] R. Mazza and C. Milani, "Exploring usage analysis in learning systems: Gaining insights from visualisations," in Usage Analysis in Learning Syst. Workshop, 12th Int. Conf. AI in Educ., 2005, pp. 65–72.
[13] D. Leony, A. Pardo, L. de la Fuente Valentín, D. S. de Castro, and C. D. Kloos, "GLASS: a learning analytics visualization tool," in Proc. 2nd Int. Conf. Learning Analytics and Knowledge, ACM, 2012, pp. 162–163.
[14] J. Larmer and J. R. Mergendoller, "Seven essentials for project-based learning," Educ. Leadership, vol. 68, no. 1, pp. 34–37, 2010.
[15] A. L. Steenkamp, "A standards-based approach to team-based student projects in an information technology curriculum," in Proc. IAIM Conf. Informatics Educ. Research, ERIC, 2002.
[16] J. Chao, "Student project collaboration using wikis," in Conf. Software Eng. Educ. & Training, IEEE, 2007, pp. 255–261.
[17] G. Gweon, S. Jun, J. Lee, S. Finger, and C. P. Rosé, "A framework for assessment of student project groups on-line and off-line," in Analyzing Interactions in CSCL, Springer, 2011, pp. 293–317.
[18] A. A. Brandyberry and S. A. Bakke, "Mitigating negative behaviors in student project teams: An information technology solution," J. Inf. Syst. Educ., vol. 17, no. 2, p. 195, 2006.
[19] S. Drummond and C. Boldyreff, "The development and trial of SegWorld: a virtual environment for software engineering student group work," in Conf. Software Eng. Educ. & Training, IEEE, 2000, pp. 87–97.
[20] P. Brusilovsky, "Adaptive hypermedia: From intelligent tutoring systems to web-based education," in Intelligent Tutoring Systems, Springer, 2000, pp. 1–7.
[21] A. T. Corbett, K. R. Koedinger, and J. R. Anderson, "Intelligent tutoring systems," Handbook of Human-Computer Interaction, pp. 849–874, 1997.
[22] N. T. Heffernan and K. R. Koedinger, "An intelligent tutoring system incorporating a model of an experienced human tutor," in Intelligent Tutoring Systems, Springer, 2002, pp. 596–608.
[23] J. A. Walonoski and N. T. Heffernan, "Detection and analysis of off-task gaming behavior in intelligent tutoring systems," in Intelligent Tutoring Systems, Springer, 2006, pp. 382–391.
[24] D. A. Kolb, "Learning styles and disciplinary differences," The Modern American College, vol. 1, pp. 232–255, 1981.
[25] J. C. Cassady and R. E. Johnson, "Cognitive test anxiety and academic performance," Contemporary Educ. Psychology, vol. 27, no. 2, pp. 270–295, 2002.
[26] M. S. Chapell, Z. B. Blanding, M. E. Silverstein, M. Takahashi, B. Newman, A. Gubi, and N. McCann, "Test anxiety and academic performance in undergraduate and graduate students," J. Educ. Psychology, vol. 97, no. 2, p. 268, 2005.
[27] M. Freeman and J. McKenzie, "SPARK, a confidential web-based template for self and peer assessment of student teamwork: benefits of evaluating across different subjects," Brit. J. Educ. Technology, vol. 33, no. 5, pp. 551–569, 2002.
[28] C. McLoughlin and J. Luca, "A learner-centred approach to developing team skills through web-based learning and assessment," Brit. J. Educ. Technology, vol. 33, no. 5, pp. 571–582, 2002.
[29] E. L. Meyen, R. J. Aust, Y. N. Bui, and R. Isaacson, "Assessing and monitoring student progress in an e-learning personnel preparation environment," Teacher Educ. and Special Educ., vol. 25, no. 2, pp. 187–198, 2002.
[30] A. Margaryan, A. Littlejohn, and G. Vojt, "Are digital natives a myth or reality? University students' use of digital technologies," Comput. & Educ., vol. 56, no. 2, pp. 429–440, 2011.
[31] R. Higgins, P. Hartley, and A. Skelton, "The conscientious consumer: Reconsidering the role of assessment feedback in student learning," Studies in Higher Educ., vol. 27, no. 1, pp. 53–64, 2002.
[32] C. Juwah, D. Macfarlane-Dick, B. Matthew, D. Nicol, D. Ross, and B. Smith, "Enhancing student learning through effective formative feedback," 2004.
[33] F. B. King and D. LaRocco, "E-journaling: A strategy to support student reflection and understanding," Current Issues in Educ., vol. 9, 2006.
[34] E. L. Corley, "A qualitative study of student perceptions regarding electronic journaling," in Conf. Mid-Western Educ. Research Assoc., ERIC, 2000.
[35] P. Davies, "Using student reflective self-assessment for awarding degree classifications," Innovations in Educ. and Teaching Int., vol. 39, no. 4, pp. 307–319, 2002.
[36] J. Cornelius-White, "Learner-centered teacher-student relationships are effective: A meta-analysis," Review Educ. Research, vol. 77, no. 1, pp. 113–143, 2007.
[37] D. Fillingham, "A comparison of digital and handwritten signatures," Ethics and Law on the Electron. Frontier, vol. 6, 1997.

Cameron C. Gray is currently a Ph.D. student in the School of Computer Science at Bangor University. His research areas include graph-based network security and applied visualization. He is currently the leader for two undergraduate modules, supervises undergraduate and M.Sc. projects, and is a teaching assistant on several more courses.

Dave Perkins is a lecturer in Computer Science at Bangor University. He leads several undergraduate and postgraduate modules within Computer Science. He also leads initiatives within the Centre for Enhancement of Learning and Teaching (CELT), including Curriculum Development, Technology and Innovation in Teaching, and The Technology in Teaching CPD Program. Dave holds a Postgraduate Certificate in Education and is also a Senior Fellow of the Higher Education Academy.

Ludmila I. Kuncheva (Lucy) is a Professor of Computer Science at Bangor University, UK. She has published two monographs and over 200 research papers. She is a Fellow of the International Association of Pattern Recognition and a Fellow of the Learned Society of Wales. Lucy has served as an Associate Editor for IEEE Transactions on Fuzzy Systems and IEEE Transactions on Pattern Analysis and Machine Intelligence.
APPENDIX A
POSTGRADUATE STUDENT SURVEY DATA

The responses have been split by question type; there are eight (8) respondents in total. Tables 1 to 3 show categorical responses. Table 4 contains the Yes/No responses. Table 9 contains all Likert-scale questions.

TABLE 1: Postgraduate Time Spent Responses
Question 1: How much time did you spend completing each review submission?
  Little (<30 min): 3 (37.5%) | Some (>30 min, <2 hrs): 4 (50%) | Lots (>2 hrs): 1 (12.5%)

TABLE 2: Postgraduate Achievement Responses
Question 6: How many achievements did you list on the review?
  None: 0 | 1-2: 4 (50%) | 3-4: 3 (37.5%) | 5 or more: 1 (12.5%)

TABLE 3: Postgraduate Challenges Faced Responses
Question 10: How many challenges did you identify on the review?
  None: 2 (25%) | 1: 1 (12.5%) | 2: 2 (25%) | 3 or more: 3 (37.5%)

TABLE 4: Postgraduate Yes/No Responses
Question 14: Were there any questions that you were unable to answer?
  Yes: 0 | No: 8 (100%)
Question 15: Were there any questions that you felt were inappropriate to ask?
  Yes: 1 (12.5%) | No: 7 (87.5%)
Question 17: Did the website break or show any errors while filling out your reviews?
  Yes: 0 | No: 8 (100%)
Question 21: I uploaded evidence to the website to support the points in my review.
  Yes: 5 (62.5%) | No: 3 (37.5%)
Question 24: Would you have rather used a document-based or paper system instead of the website?
  Yes: 0 | No: 8 (100%)

APPENDIX B
UNDERGRADUATE STUDENT SURVEY DATA

The results have been split by question type, in the same way as Appendix A. Tables 5 to 7 show categorical responses. Table 8 contains the Yes/No responses. Table 10 contains all Likert-scale questions. There are three (3) respondents in total.

TABLE 5: Undergraduate Time Spent Responses
Question 1: How much time did you spend completing each review submission?
  Little (<30 min): 3 (100%) | Some (>30 min, <2 hrs): 0 | Lots (>2 hrs): 0

TABLE 6: Undergraduate Achievement Responses
Question 6: How many achievements did you list on the review?
  None: 0 | 1-2: 0 | 3-4: 2 (66.6%) | 5 or more: 1 (33.3%)

TABLE 7: Undergraduate Challenges Faced Responses
Question 10: How many challenges did you identify on the review?
  None: 0 | 1: 0 | 2: 0 | 3 or more: 3 (100%)

TABLE 8: Undergraduate Yes/No Responses
Question 14: Were there any questions that you were unable to answer?
  Yes: 0 | No: 3 (100%)
Question 15: Were there any questions that you felt were inappropriate to ask?
  Yes: 0 | No: 3 (100%)
Question 17: Did the website break or show any errors while filling out your reviews?
  Yes: 3 (100%) | No: 0
Question 21: I uploaded evidence to the website to support the points in my review.
  Yes: 0 | No: 3 (100%)
Question 24: Would you have rather used a document-based or paper system instead of the website?
  Yes: 0 | No: 3 (100%)
TABLE 9: Postgraduate Likert-Scale Responses
Counts are given as: Strongly Disagree (1) / Mildly Disagree (2) / Neither Agree Nor Disagree (3) / Mildly Agree (4) / Strongly Agree (5)

Q2: The reviews helped me gauge my progress.
  0 / 2 (25%) / 1 (12.5%) / 5 (62.5%) / 0
Q3: Reporting my progress every year helped me keep track of my overall progress.
  0 / 0 / 6 (75%) / 2 (25%) / 0
Q4: I was motivated to make progress each time because I knew I was being asked to fill in a review.
  0 / 2 (25%) / 3 (37.5%) / 3 (37.5%) / 0
Q5: I think I made progress over this review period.
  0 / 0 / 3 (37.5%) / 5 (62.5%) / 0
Q7: I found writing a list of what I had achieved during each period useful.
  0 / 0 / 4 (50%) / 3 (37.5%) / 1 (12.5%)
Q8: I liked the review process, for whatever reasons.
  0 / 2 (25%) / 3 (37.5%) / 2 (25%) / 1 (12.5%)
Q9: I found having a confidential survey to discuss the relationship with my supervisor to be useful.
  0 / 2 (25%) / 3 (37.5%) / 2 (25%) / 1 (12.5%)
Q11: I found the questions asked difficult to answer.
  1 (12.5%) / 3 (37.5%) / 2 (25%) / 2 (25%) / 0
Q12: I didn't like answering the questions asked of me.
  1 (12.5%) / 2 (25%) / 4 (50%) / 1 (12.5%) / 0
Q13: I used the information from previous reviews to complete this one.
  2 (25%) / 2 (25%) / 1 (12.5%) / 2 (25%) / 0
Q19: I found not being able to use the website from home limiting.
  0 / 3 (37.5%) / 0 / 1 (12.5%) / 4 (50%)
Q20: I found the website easy to use.
  0 / 0 / 2 (25%) / 3 (37.5%) / 3 (37.5%)
Q21: I liked the range of options/answers that I was allowed to give.
  0 / 0 / 3 (37.5%) / 3 (37.5%) / 2 (25%)
Q22: I liked using a web-based system for the monitoring process.
  0 / 0 / 0 / 5 (62.5%) / 3 (37.5%)
Q23: Using a website made answering the review and confidential survey questions easy.
  0 / 0 / 0 / 5 (62.5%) / 3 (37.5%)

TABLE 10: Undergraduate Likert-Scale Responses
Counts are given as: Strongly Disagree (1) / Mildly Disagree (2) / Neither Agree Nor Disagree (3) / Mildly Agree (4) / Strongly Agree (5)

Q2: The reviews helped me gauge my progress.
  0 / 1 (33.3%) / 1 (33.3%) / 1 (33.3%) / 0
Q3: Reporting my progress every 3 weeks helped me keep track of my overall progress.
  0 / 1 (33.3%) / 1 (33.3%) / 1 (33.3%) / 0
Q4: I was motivated to make progress each time because I knew I was being asked to fill in a review.
  1 (33.3%) / 1 (33.3%) / 0 / 0 / 1 (33.3%)
Q5: I think I made progress between each review.
  0 / 0 / 1 (33.3%) / 2 (66.6%) / 0
Q7: I found writing a list of what I had achieved during each period useful.
  0 / 1 (33.3%) / 2 (66.6%) / 0 / 0
Q8: I liked the review process, for whatever reasons.
  0 / 2 (66.6%) / 0 / 1 (33.3%) / 0
Q9: I found having a confidential survey to discuss the relationship with my supervisor to be useful.
  0 / 2 (66.6%) / 0 / 1 (33.3%) / 0
Q11: I found the questions asked difficult to answer.
  0 / 1 (33.3%) / 1 (33.3%) / 1 (33.3%) / 0
Q12: I didn't like answering the questions asked of me.
  0 / 3 (100%) / 0 / 0 / 0
Q13: I used the information from previous reviews to complete this one.
  1 (33.3%) / 1 (33.3%) / 1 (33.3%) / 0 / 0
Q19: I found not being able to use the website from home limiting.
  0 / 1 (33.3%) / 2 (66.6%) / 0 / 0
Q20: I found the website easy to use.
  0 / 0 / 1 (33.3%) / 2 (66.6%) / 0
Q21: I liked the range of options/answers that I was allowed to give.
  0 / 0 / 2 (66.6%) / 1 (33.3%) / 0
Q22: I liked using a web-based system for the monitoring process.
  0 / 0 / 1 (33.3%) / 0 / 2 (66.6%)
Q23: Using a website made answering the review and confidential survey questions easy.
  0 / 0 / 1 (33.3%) / 0 / 2 (66.6%)
