Web Based Student Progress Monitoring System
Abstract—Undergraduate and postgraduate students take part in project work designed with broad scopes. These projects encourage
the student to showcase their abilities along with time and project management skills. Timescales for these types of projects are
necessarily longer, meaning that keen supervision can be required to ensure efforts are appropriately applied. We present a web-
based pedagogical tool and workflow to support effective monitoring of these students within the UK and Western European Higher
Education systems. This workflow includes an oversight capability to handle the objections and unavoidable subjectivity that can arise.
Ultimately, the process and tool are designed to help students achieve their potential, improving success rates when
measured against the learning outcomes. The authors believe that the tool and workflow are abstracted enough to be used in any
project-based review process.
1 INTRODUCTION
the important detail is lost as the virtual environment cannot capture it. For example, the number of papers, websites or other electronic resources used could easily be recorded. It is not possible to track offline resources, nor whether the student read or understood the material(s).

Research into similar systems, dubbed Intelligent Tutoring Systems, was in vogue from the late 1990s to the mid-2000s. These systems attempted to emulate the response and adaptive qualities of live teachers [20], [21], [22]. While this form of system may well be successful in courses with narrowly defined curricula, the range of responses required to handle project work is simply too vast to model in one of these systems. Another drawback of a fully automated/electronic system is that students can learn to game the system [23]. A corollary is where students apply a strategic learning style [24] to pass a module/test/examination. Without the 'human in the loop', this gaming can go unchecked. In a project context, the result is weakened oversight, and arguably more aspects that could go awry despite the application of technology and apparently adequate reported progress.

3 THE WESTERN EUROPEAN RESEARCH MONITORING PROCESS

This section most readily applies to Ph.D. and M.Sc. (Res) programs. There will be slight variations by institution and subject. Elements can be transposed into undergraduate or taught M.Sc. theses.

Institutions in the UK and Western Europe share a similar annual, or semi-annual, review process for monitoring progress and setting goals. We base this assertion on personal communications between the authors and academics from the University of Cagliari, Italy; Delft University of Technology, Netherlands; and the Technical University of Sofia, Bulgaria. The United Kingdom perspective1 comes from the authors' first-hand knowledge of the Bangor University process.

1. The substantive portions of the UK view are also corroborated by various policy documents, such as http://www.ncl.ac.uk/fms/postgrad/documentation/documents/Resstudenthandbook2014-15.pdf §4, http://www.abdn.ac.uk/cops/graduate/assessment-process-255.php, and http://www.gcu.ac.uk/graduateschool/postgraduatestudy/phdstudyatgcu/researchstudentprogressionforms/

Each period, either semi-annually or annually as defined by local policy, the student is required to prepare a report detailing activities in that period. The prompts provided, on forms where used, are largely the same: 'what have you achieved?', 'what progress have you made?', 'what challenges have you faced?', and 'what is the plan for the next period?'. These questions suggest a reflective approach more than a coldly analytic one. Once the report has been prepared, the student will meet formally with their supervisory committee. The student's immediate supervisor(s), any co-supervisors, and any other advisers assisting with the endeavor have a seat on this committee. The student may be asked to present their report orally, or it may take the form of a Q&A session.

After this meeting, the supervisory committee will prepare a written report detailing the student's performance and the committee's recommendation (progress, fail or downgrade to a lower qualification). After being agreed and signed by the members of the committee and the student, the report is submitted to the institution. The tone and tenor of this return could be colored (either way) by the relationship [2]. Most versions of the forms asked for an impression of progress but did not require evidence to be submitted to justify those perceptions. Therefore, no proof of a truly objective assessment of achievement or skill can be shown. As a result, the report could be prepared in a one-sided manner. If or when disagreements occur, the student may feel they have little option but to avail themselves of the formal appeals process, further fueling a deterioration of the relationship.

4 BENEFITS OF REVISING THE MONITORING PROCESS

Any progress monitoring will necessarily include an element of evaluation, of both the student and the deliverables. It is natural in these situations that students feel some level of performance anxiety [25], [26]. Using an environment that is familiar to the student can help lessen that anxiety. Higher Education institutions routinely use Virtual Learning Environments (VLEs) such as Moodle, Blackboard Learn™ and similar. Utilizing these systems for assessment is not a new concept [27], [28], [29]. Rarely, however, are such systems used for control or assessment of research and/or project activities. Such systems are used as a repository for marks and feedback to the student as well as for plagiarism detection. As more digital natives embark on their higher education careers, the use of technology will be far more accepted than ever before [30]. This familiarity may help lessen the anxiety associated with the progress review when students are compiling their responses.

In these longer-term projects, academics tend toward using a report such as a dissertation or a thesis as the primary deliverable and use this for assessment. The thinking behind this choice is to incorporate transferable and communication skills into the evaluation of student performance. This practice is reasonable and filled with good intentions. It does mean, however, that the assessment becomes entirely summative. The workflow and tool presented here do not argue for removing this form of assessment, but for supplementing it with formal, regular and transparent monitoring meetings.

Any monitoring tool/process need not solely report students' progress; it can be another channel for feedback, benefiting both sides of the relationship. A previous study spanning three years found that students do value their tutors' feedback but will only spend a limited time on it [31]. In a project setting this translates into requiring more, but smaller, 'chunks' of feedback to digest at regular intervals. A methodology using this suggestion can help support the existing supervisory process with plans and guidance at every stage. Research commissioned by the Higher Education Academy [32] highlights the necessity of feedback to nurture and encourage a partnership approach to release a student's potential. Chao [16] has found that electronic tools can be particularly effective in this regard.
Fig. 1. Flowchart showing the proposed monitoring process. Stages: Review Created; Student Completes Review Form; Student Completes Survey; Supervisor Completes Review Form; Chair Reviews Submissions; Is Meeting Required? (Yes/No); Student and Supervisor Agree Outcome/Actions; Comment Stage; Optional Override; Chair Signs-off Review. See Section 5 for full descriptions of each stage. The 'meeting required' decision is made in concert with the participants of the review rather than by any one party.
An unintended, but highly desirable, side-effect of requiring regular progress reports is that students will become more reflective about their experiences. Previous investigations [33], [34] corroborate this assertion. As a consequence, students are more aware of their progress and accomplishments, as well as where issues reside. As the student is already aware of potential failings, this can lessen the impact of some negative feedback. However, the accepted deviations regarding self-assessment must be taken into account [35]. The critical item from the cited list is that stronger students are harsher on themselves, whereas weaker students, typically, overrate their work.

Within Bangor University, previous solutions have been unable to increase transparency. While the student concerned had input, the report from their supervisor was the final word. A meta-analysis [36] of 119 individual studies has found that the teacher-student relationship is one of the highest-impact elements within a teacher's control in driving student achievement. To be transparent, a method that allows two-way constructive criticism is required. Such a system is necessary for ensuring the continued viability of longer-term projects.

5 THE PROPOSED TOOL AND WORKFLOW

The Project Progress Monitoring System is designed to be a non-intrusive method bringing transparency and outside corroboration to projects at any level. Figure 1 shows, graphically, the proposed workflow; we explain each step in greater detail in this section.

The most obvious deviation from existing processes is that we recommend adding a chairperson. This participant is expected to be objective and ideally impartial. The chairperson acts as a moderator between the student and their supervisor. They hold the casting vote on whether the student has demonstrated sufficient progress or not. Any candidate for chairperson would be part of the same field of study, but not directly involved with the work. In smaller departments/schools/groups, or those with a close-knit community, suitable candidates may be difficult to find. In these cases, we would expect the head of the community to make an executive decision in appointing a chair.
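To make the ordering of these stages concrete, the flow of Figure 1 can be read as a simple state machine. The following Python sketch is our own illustration (the stage names are taken from the figure's labels; none of this is the tool's actual code):

    from enum import Enum, auto

    class Stage(Enum):
        """Stages of the proposed monitoring workflow (Figure 1)."""
        REVIEW_CREATED = auto()
        STUDENT_SUBMITS_FORM_AND_SURVEY = auto()
        SUPERVISOR_SUBMITS_FORM = auto()
        CHAIR_REVIEWS_SUBMISSIONS = auto()
        OPTIONAL_MEETING = auto()           # held only if the participants decide one is required
        AGREE_OUTCOME_AND_ACTIONS = auto()
        COMMENT_STAGE = auto()
        OPTIONAL_OVERRIDE = auto()          # the chair may force sign-off over objections
        CHAIR_SIGNS_OFF = auto()

    # Linear happy path; the meeting and override stages may be skipped.
    NEXT = dict(zip(list(Stage)[:-1], list(Stage)[1:]))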
To make the process as straightforward as possible, we have implemented the system using familiar web-based concepts. The staff and students should already be familiar with these constructs as they are becoming more ubiquitous every day. Using these constructs means there is not another set of controls, concepts, and applications to learn. This familiarity should cause less of the anxiety common with new systems, helping the process bed in rather than users pushing against it. However, there will always be segments of the student body that find using a web environment difficult. This effect may be particularly acute for mature students or those from lower socio-economic backgrounds who have not been exposed to technology as much.

5.1 Process Description

5.1.1 Review Creation and Form Submission
Originally the authors envisaged a member of administrative or support staff performing the initial creation of the review for the appropriate students. However, there is no reason that each student's primary supervisor could not carry out this task instead. The creation step simply involves selecting the program, the phase of study, the form submission deadline and a provisional date for the review meeting (assuming it is required). There are no restrictions on when reviews may be scheduled. Most institutions' regulations stipulate that a review must occur at least once a year. By accommodating more than one Review during any phase of study, supervisors are given a level of flexibility to schedule reviews according to need, events, and ability.

Once created/scheduled, the student is asked to complete two forms, each with a different purpose. The first form provides the student the opportunity to catalog their achievements, progress made, challenges faced, and goals for the next period (however long that may be). The student may (optionally) upload any supporting evidence they deem suitable.
The second form, named the 'Confidential Survey', offers a quantitative and qualitative view into the student experience. It is intended to be a 'safe space' for the student to make observations and, if necessary, raise concerns about their situation. For the school/department/college/institution, these responses provide invaluable feedback when examined in an aggregate context, for example, on a cohort, program or school level. In some regards, 'Confidential Survey' is a misnomer. It is only intended to be confidential from the primary supervisor, so that the student has a mechanism to report critically on the working relationship. The survey is not meant to be confidential from the institution or chairperson. The authors would expect anyone appointed as a chairperson to abide by professional ethics in not revealing the contents of the survey unless there is anything of extreme concern.

The supervisor is asked to complete a different form, with prompts tailored to the evaluation of the student's performance. Evaluations use a four-point rubric: met/exceeded expectations, some improvement needed, major improvement required, and unacceptable. These rubrics are intended to be generic; each school/department is then left to produce a form of words describing what each element means within the realm of their programs. Alongside each rating prompt, the supervisor must provide a rationale for the selection. Without this provision, the chairperson would be presented with a verdict but no evidence. We expect that students would use the upload feature to balance out any counter-claims. Neither the student nor the supervisor is permitted to access the initial submissions of the other, as the returns are intended to be independent.

5.1.3 Action Plan/Comments
The output of any review is an Action Plan and a formal decision on progression. The Action Plan, as its name suggests, is the formally agreed next steps. The suggested workflow provides for four categories of person that can have actions assigned: student, supervisor, school and other. These four have been selected in recognition of the fact that, despite participants' best efforts, other external factors may result in a poor outcome in a review or of the project. Examples of actions on a school could revolve around improved access to resources, library materials or clarification on process, regulations or assessment criteria. The 'other' actor is included for situations where there may be institution-wide changes or clarification required, where projects include an industrial, professional or clinical partner, or any other third party needing to complete actions to ensure the ongoing success of the project.

Rarely, in assessment matters, are things completely black and white. Research and other longer-running project work present a particular challenge as multiple skills are judged at once. To balance the power assessors wield in these situations, any of the participants have the ability to make comments on the process. The system stores these remarks alongside the submissions and decisions, allowing objections and corrections to be noted. The authors envisage situations where the participants disagree on the scope, wording, or presence of items on the action plan. It remains within the chairperson's option to adjust the Action Plan or formal decision based on this new information.

Once the participants have agreed on the outcome and actions, the system notifies the chairperson that the review is ready for sign-off. Once a review has been signed off, no further changes or comments are permitted. However, if one participant is proving to be particularly intransigent, the chairperson is authorized to force the issue, overriding the objections. This ability is required to defuse a situation where a student refuses to give approval, most likely because it would result in a poor outcome, in order to prolong the process. If this occurs, a comment is added to the review attesting to the override, the review is signed off, and all participants are notified.
6 TOOL DESIGN

Most, if not all, interfaces with other systems and technologies have been configured to use Bangor University's IT infrastructure. However, in most cases it would be trivial to adjust for another institution, assuming standards compliance.

The largest consideration in the design of a project monitoring system is the management of the sheer quantity of differences among the various projects and levels. The simplest solution to this issue is to make the entire system data-driven, rather than to hard-code any assumed choices. As with all large development projects, the primary task is to scope, define, and investigate the problem domain. In this case, the main entities in the system are: Student, Supervisor, Chair, and Review. Chair, student, and supervisor are all forms of a user and only vary in the role held within any given Review.

There are only two concretely defined entities within the system: Programs and Progress Rubrics. Programs represent types of activities to be monitored, in this case, degree types. The full list of available Programs is:

• B.Sc.
• B.A.
• M.Phil.
• M.Sc.(Res)
• M.Phil. [Part Time]
• M.Sc.(Res) [Part Time]
• Ph.D.
• Ph.D. [Part Time]
• Research (Generic or non-student)

There are four Progress Rubrics:

• Achieved/Exceeded Expectations
• Targeted Improvement Required
• Significant Improvement Required
• Unacceptable

Individual schools/departments are able to customize the exact wording of each prompt.
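As an illustration of this data-driven design, the sketch below (ours; the paper does not publish the tool's actual schema, so all type and field names are assumptions) models the two concretely defined entities as enumerations and a Review as plain data:

    from dataclasses import dataclass, field
    from enum import Enum

    class Program(Enum):
        """The fixed list of monitorable activity types (degree types)."""
        BSC = "B.Sc."
        BA = "B.A."
        MPHIL = "M.Phil."
        MSC_RES = "M.Sc.(Res)"
        MPHIL_PT = "M.Phil. [Part Time]"
        MSC_RES_PT = "M.Sc.(Res) [Part Time]"
        PHD = "Ph.D."
        PHD_PT = "Ph.D. [Part Time]"
        RESEARCH = "Research (Generic or non-student)"

    class ProgressRubric(Enum):
        """Four-point rubric; schools customize the exact wording."""
        ACHIEVED = "Achieved/Exceeded Expectations"
        TARGETED = "Targeted Improvement Required"
        SIGNIFICANT = "Significant Improvement Required"
        UNACCEPTABLE = "Unacceptable"

    class Role(Enum):
        """Chair, student and supervisor are all users; only the role varies per Review."""
        STUDENT = "student"
        SUPERVISOR = "supervisor"
        CHAIR = "chair"

    @dataclass
    class Review:
        """One monitoring review for one student within a phase of study."""
        program: Program
        phase_of_study: str
        submission_deadline: str                 # ISO date, e.g. "2016-05-01"
        provisional_meeting_date: str | None
        participants: dict[str, Role] = field(default_factory=dict)  # username -> role

Everything else (forms, questions, wording) is left as data, which is what allows schools to customize prompts without code changes.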
Initially, the tool was implemented using features offered by Bangor's chosen VLE, Blackboard™. However, the task was ultimately beyond the capabilities of this tool. Within Blackboard, assignments and quizzes (types of student submissions) only have two participants: instructor(s) and the student/learner.

6.1 Form Versions
Every Review will contain the three forms, but these must vary by School and over time. The concept of Form Versions is used to accommodate this variance; a simplified Entity Relationship Diagram illustrates this in Figure 2.

Fig. 2. Entity Relationship Diagram showing Form Version relations (entities: School, Form Version, Question, Review).

Using the versions concept, the system can recreate the form as it was in use at the time, irrespective of how much time has passed or how many subsequent versions there are. Administrators may mark versions as 'available for copy', allowing others to base their form on this shared or reference copy. The shared versions are then made available to all departments/schools to base their own form versions on. Similarly, each version can be marked as 'active'. A version can only be used in a Review after being made active. The tool's algorithm selects the most recent active version (highest version number) for the selected school when creating Reviews.
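The selection rule lends itself to a few lines of code. This is a minimal sketch under the assumptions above (illustrative names; not the tool's source):

    from dataclasses import dataclass

    @dataclass
    class FormVersion:
        school: str
        version: int
        active: bool
        available_for_copy: bool = False

    def select_version(versions: list[FormVersion], school: str) -> FormVersion:
        """Pick the highest-numbered active version for the given school,
        mirroring the selection rule described in Section 6.1."""
        candidates = [v for v in versions if v.school == school and v.active]
        if not candidates:
            raise LookupError(f"no active form version for school {school!r}")
        return max(candidates, key=lambda v: v.version)

Raising an error when no active version exists mirrors the constraint that a version can only be used in a Review after being made active.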
6.2 Question Types
To allow users to define their questions, the tool provides some basic question types. The type frames the mode of response the participant can make. Each type has associated user interface (UI) files and handling routines built into the tool. These routines handle the associated data operations, controlling the display and storage of submitted responses. The initial set of question types provided in the first release of the tool is:

• 5-Point Likert Scale
• File Upload
• Free-form Text
• Progress Rubric
• Yes/No

Due to the logic associated with each question type, the tool will need to undergo further development to accommodate changes to the types. This set was based on types used in Bangor University's previous paper system. Section 7 presents the case studies made using the tool. The second case study (with undergraduates) did not require any changes to the question types.
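As a sketch of how per-type handling routines might constrain responses (our illustration; the tool's real UI and storage routines are not published in the paper):

    from enum import Enum

    class QuestionType(Enum):
        LIKERT_5 = "5-point Likert scale"
        FILE_UPLOAD = "file upload"
        FREE_TEXT = "free-form text"
        PROGRESS_RUBRIC = "progress rubric"
        YES_NO = "yes/no"

    RUBRIC_LEVELS = {
        "Achieved/Exceeded Expectations",
        "Targeted Improvement Required",
        "Significant Improvement Required",
        "Unacceptable",
    }

    def validate_response(qtype: QuestionType, value) -> bool:
        """Each question type frames the mode of response a participant can make."""
        if qtype is QuestionType.LIKERT_5:
            return isinstance(value, int) and 1 <= value <= 5
        if qtype is QuestionType.YES_NO:
            return isinstance(value, bool)
        if qtype is QuestionType.PROGRESS_RUBRIC:
            return value in RUBRIC_LEVELS
        if qtype is QuestionType.FREE_TEXT:
            return isinstance(value, str)
        if qtype is QuestionType.FILE_UPLOAD:
            return isinstance(value, bytes)   # uploaded file contents
        return False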
6.3 Authentication and Role Identification
A crucial part of any IT system is security: identifying and authenticating users within a system, and granting appropriate access. The tool could be configured to use its own database of user credentials. However, university staff and students already have many other credentials for separate systems in addition to their institutional ones. The tool has therefore been configured to use the Lightweight Directory Access Protocol (LDAP). This protocol integrates the tool with the institution's Active Directory database so that users may authenticate with their usual institution credentials.

The tool offers auto-complete when entering student, staff or chair details. This is achieved using the same LDAP connection and the organizational groups configured within the Active Directory. All staff members belong to various defined groups, such as the exported group of staff members from the campus management software "Banner". Similar groups exist to identify students based on their course, cohort, supervisor and degree program. This detail allows the tool to restrict searches for a student's name, surname, or username to those that are needed, based on the course/program under review.
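As an example of what such an integration could look like, here is a hedged sketch using the ldap3 Python package (our choice; the paper does not name the tool's implementation language, and the host name, domain, and search base below are placeholders):

    from ldap3 import Server, Connection, ALL, SUBTREE

    def authenticate(username: str, password: str) -> Connection | None:
        """Bind against the institutional directory with the user's usual credentials."""
        server = Server("ldaps://ad.example.ac.uk", get_info=ALL)   # placeholder host
        conn = Connection(server, user=f"CAMPUS\\{username}", password=password)
        return conn if conn.bind() else None

    def autocomplete_students(conn: Connection, prefix: str, cohort_group: str) -> list[str]:
        """Search names restricted to a course/cohort group, as the tool does."""
        conn.search(
            search_base="OU=Students,DC=example,DC=ac,DC=uk",       # placeholder base
            search_filter=f"(&(objectClass=person)(memberOf={cohort_group})"
                          f"(|(sn={prefix}*)(givenName={prefix}*)(sAMAccountName={prefix}*)))",
            search_scope=SUBTREE,
            attributes=["displayName", "sAMAccountName"],
        )
        return [str(entry.displayName) for entry in conn.entries]

Restricting the search filter to a group mirrors the paper's point that lookups are limited to the course/program under review rather than the whole directory.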
6.4 Form Visibility
In the original design, all participants were able to see the details about the review: the deadlines, status and stage, action plan, comments, etc. They were also able to review the forms that they themselves had completed, but not the submissions of others; this restriction means the student was not able to see their supervisor's return. The chair, meanwhile, could view anything. Some study participants felt that, while they were able to see the final decision, they should be entitled to see the final report submitted by the chair to the institution. The second version of the tool included this suggestion, but only once the review was finalized. We did not make the remaining forms any more open, keeping the independent nature of the submissions to avoid bias or tainted results.
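These visibility rules reduce to a small predicate. The following is our paraphrase of this subsection, not code from the tool:

    def may_view_form(viewer_role: str, owner_role: str, review_finalized: bool) -> bool:
        """Who may see which form submission, per the second version of the tool:
        the chair sees everything; others see only their own forms, plus the
        chair's final report once the review has been finalized."""
        if viewer_role == "chair":
            return True
        if viewer_role == owner_role:
            return True   # participants can always review their own submissions
        if owner_role == "chair" and review_finalized:
            return True   # final report opened to participants after sign-off
        return False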
6.5 Security and Limited Access
As this tool is a prototype and has a deliberately limited user group, the decision was taken to make it available only within the Bangor University campus. As such, no party (including students with a review pending) could access it from a general internet connection. The primary motivation was security. Every endeavor was made to ensure the tool did not expose confidential information. However, a full security review was not conducted on the software. There may be, however unlikely, situations where the application could leak data or allow unauthenticated users access to sensitive systems. During the studies, the authors were aware that this decision may make it more inconvenient for participants to complete the process. As such, the evaluation survey was changed to try to assess this impact quantitatively.

7 CASE STUDIES

Two studies were carried out: the first using a postgraduate setup, the second using forms modified to be useful to undergraduate students. Both studies were carried out using staff and students in the School of Computer Science at Bangor University between 2015 and 2016. The number of participants in these studies is too small to draw generalized conclusions. However, they do show positive results.

7.1 Form Configuration
In both studies, the configuration of the forms was set to match an updated internal paper review process. This update more closely aligned the system with the Postgraduate Research Experience Survey (PRES) developed by the Higher Education Academy2. The student review form contains the usual four questions prompting for details of their achievements, progress, challenges, and plans. The Confidential Survey asks the following, using a five-point Likert scale:

• 'I know whom to speak to/contact if I have a problem or concern about my studies.'
• 'Staff make an effort to understand any difficulties I face.'
• 'I understand the standard of work that is expected of me.'
• 'I have adequate access to the necessary research equipment.'
• 'I have adequate access to the library facilities necessary for my research.'
• 'I have suitable working space at the School.'
• 'Feedback from my supervisor(s) helps me plan my research/work.'
• 'I have the technical support I need.'
• 'I am encouraged to think about my career development needs.'
• 'I feel integrated into the research community of the School.'
• 'There are opportunities for social contact with other research students in the College and University.'
• 'I understand the requirements of the monitoring process.'

Lastly, the student is offered an opportunity to raise any other concerns they wish to. The prompt reminds students that to generate a positive outcome, they must detail the issue exactly, along with a desired course of action. There is no method to measure these matters or their severity quantitatively; the chairperson must decide if and how much weight each is afforded within the process.

The supervisor and chairperson are asked to rate the students on the following aspects of their development and capabilities displayed during the review period:

• Project Management Skills
• Knowledge and Understanding
• Theoretical/Numerical Analysis Skills
• Experimental Skills
• Written Presentation Skills
• Oral Presentation Skills
• Overall Progress

There are four rubrics available for each rating. From best to worst: "Met or exceeded all specified requirements.", "Substantively met requirements, targeted improvement needed in some areas.", "Failed to meet requirements in some areas. Significant improvement is required." and "Failed to meet requirements for the course.".

For each rubric prompt, there is a corresponding free-form prompt for the rationale. The staff form also includes a five-point Likert scale question to comment on the effectiveness of the supervisory relationship.

2. Full details of the survey, methodology and results can be found at https://www.heacademy.ac.uk/research/surveys/postgraduate-research-experience-survey-pres
7.2 Study Sample 1: Ph.D. Progression

7.3 Study Sample 2: Undergraduate Dissertations

APPENDIX A
POSTGRADUATE STUDENT SURVEY DATA
The responses have been split by question type; there are eight (8) respondents in total. Tables 1 to 3 show categorical responses. Table 4 contains the Yes/No responses. Table 9 contains all Likert-scale questions.

APPENDIX B
UNDERGRADUATE STUDENT SURVEY DATA
The results have been split by question type, in the same way as Appendix A. Tables 5 to 7 show categorical responses. Table 8 contains the Yes/No responses. Table 10 contains all Likert-scale questions. There are three (3) respondents in total.
TABLE 1
Postgraduate Time Spent Responses
Q1: How much time did you spend completing each review submission?
Little (<30 min): 3 (37.5%) | Some (>30 min, <2 hrs): 4 (50%) | Lots (>2 hrs): 1 (12.5%)

TABLE 2
Postgraduate Achievement Responses
Q6: How many achievements did you list on the review?
None: 0 | 1-2: 4 (50%) | 3-4: 3 (37.5%) | 5 or more: 1 (12.5%)

TABLE 3
Postgraduate Challenges Faced Responses
Q10: How many challenges did you identify on the review?
None: 2 (25%) | 1: 2 (25%) | 2: 1 (12.5%) | 3 or more: 3 (37.5%)

TABLE 4
Postgraduate Yes/No Responses
Q14: Were there any questions that you were unable to answer? Yes: 0 | No: 8 (100%)
Q15: Were there any questions that you felt were inappropriate to ask? Yes: 1 (12.5%) | No: 7 (87.5%)
Q17: Did the website break or show any errors while filling out your reviews? Yes: 0 | No: 8 (100%)
Q21: I uploaded evidence to the website to support the points in my review. Yes: 5 (62.5%) | No: 3 (37.5%)
Q24: Would you have rather used a document-based or paper system instead of the website? Yes: 0 | No: 8 (100%)

TABLE 5
Undergraduate Time Spent Responses
Q1: How much time did you spend completing each review submission?
Little (<30 min): 3 (100%) | Some (>30 min, <2 hrs): 0 | Lots (>2 hrs): 0

TABLE 6
Undergraduate Achievement Responses
Q6: How many achievements did you list on the review?
None: 0 | 1-2: 0 | 3-4: 2 (66.6%) | 5 or more: 1 (33.3%)

TABLE 7
Undergraduate Challenges Faced Responses
Q10: How many challenges did you identify on the review?
None: 0 | 1: 0 | 2: 0 | 3 or more: 3 (100%)

TABLE 8
Undergraduate Yes/No Responses
Q14: Were there any questions that you were unable to answer? Yes: 0 | No: 3 (100%)
Q15: Were there any questions that you felt were inappropriate to ask? Yes: 0 | No: 3 (100%)
Q17: Did the website break or show any errors while filling out your reviews? Yes: 3 (100%) | No: 0
Q21: I uploaded evidence to the website to support the points in my review. Yes: 0 | No: 3 (100%)
Q24: Would you have rather used a document-based or paper system instead of the website? Yes: 0 | No: 3 (100%)
TABLE 9
Postgraduate Likert-Scale Responses
(Each cell: count (%). Columns: Strongly Disagree (1) | Mildly Disagree (2) | Neither Agree Nor Disagree (3) | Mildly Agree (4) | Strongly Agree (5))
Q2: The reviews helped me gauge my progress. 0 | 2 (25%) | 1 (12.5%) | 5 (62.5%) | 0
Q3: Reporting my progress every year helped me keep track of my overall progress. 0 | 0 | 6 (75%) | 2 (25%) | 0
Q4: I was motivated to make progress each time because I knew I was being asked to fill in a review. 0 | 2 (25%) | 3 (37.5%) | 3 (37.5%) | 0
Q5: I think I made progress over this review period. 0 | 0 | 3 (37.5%) | 5 (62.5%) | 0
Q7: I found writing a list of what I had achieved during each period useful. 0 | 0 | 4 (50%) | 3 (37.5%) | 1 (12.5%)
Q8: I liked the review process, for whatever reasons. 0 | 2 (25%) | 3 (37.5%) | 2 (25%) | 1 (12.5%)
Q9: I found having a confidential survey to discuss the relationship with my supervisor to be useful. 0 | 2 (25%) | 3 (37.5%) | 2 (25%) | 1 (12.5%)
Q11: I found the questions asked difficult to answer. 1 (12.5%) | 3 (37.5%) | 2 (25%) | 2 (25%) | 0
Q12: I didn't like answering the questions asked of me. 1 (12.5%) | 2 (25%) | 4 (50%) | 1 (12.5%) | 0
Q13: I used the information from previous reviews to complete this one. 2 (25%) | 2 (25%) | 1 (12.5%) | 2 (25%) | 0
Q19: I found not being able to use the website from home limiting. 0 | 3 (37.5%) | 0 | 1 (12.5%) | 4 (50%)
Q20: I found the website easy to use. 0 | 0 | 2 (25%) | 3 (37.5%) | 3 (37.5%)
Q21: I liked the range of options/answers that I was allowed to give. 0 | 0 | 3 (37.5%) | 3 (37.5%) | 2 (25%)
Q22: I liked using a web-based system for the monitoring process. 0 | 0 | 0 | 5 (62.5%) | 3 (37.5%)
Q23: Using a website made answering the review and confidential survey questions easy. 0 | 0 | 0 | 5 (62.5%) | 3 (37.5%)

TABLE 10
Undergraduate Likert-Scale Responses
(Each cell: count (%). Columns: Strongly Disagree (1) | Mildly Disagree (2) | Neither Agree Nor Disagree (3) | Mildly Agree (4) | Strongly Agree (5))
Q2: The reviews helped me gauge my progress. 0 | 1 (33.3%) | 1 (33.3%) | 1 (33.3%) | 0
Q3: Reporting my progress every 3 weeks helped me keep track of my overall progress. 0 | 1 (33.3%) | 1 (33.3%) | 1 (33.3%) | 0
Q4: I was motivated to make progress each time because I knew I was being asked to fill in a review. 1 (33.3%) | 1 (33.3%) | 0 | 0 | 1 (33.3%)
Q5: I think I made progress between each review. 0 | 0 | 1 (33.3%) | 2 (66.6%) | 0
Q7: I found writing a list of what I had achieved during each period useful. 0 | 1 (33.3%) | 2 (66.6%) | 0 | 0
Q8: I liked the review process, for whatever reasons. 0 | 2 (66.6%) | 0 | 1 (33.3%) | 0
Q9: I found having a confidential survey to discuss the relationship with my supervisor to be useful. 0 | 2 (66.6%) | 0 | 1 (33.3%) | 0
Q11: I found the questions asked difficult to answer. 0 | 1 (33.3%) | 1 (33.3%) | 1 (33.3%) | 0
Q12: I didn't like answering the questions asked of me. 0 | 3 (100%) | 0 | 0 | 0
Q13: I used the information from previous reviews to complete this one. 1 (33.3%) | 1 (33.3%) | 1 (33.3%) | 0 | 0
Q19: I found not being able to use the website from home limiting. 0 | 1 (33.3%) | 2 (66.6%) | 0 | 0
Q20: I found the website easy to use. 0 | 0 | 1 (33.3%) | 2 (66.6%) | 0
Q21: I liked the range of options/answers that I was allowed to give. 0 | 0 | 2 (66.6%) | 1 (33.3%) | 0
Q22: I liked using a web-based system for the monitoring process. 0 | 0 | 1 (33.3%) | 0 | 2 (66.6%)
Q23: Using a website made answering the review and confidential survey questions easy. 0 | 0 | 1 (33.3%) | 0 | 2 (66.6%)