Automated System for Audit Report Verification

Contents

General Introduction 1

1 Project Study 3
1.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2 Organization Presentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2.1 ANCS Identity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2.2 Mission of ANCS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2.3 Audit Directorate of Information Systems Security . . . . . . . . . . . . . . . 5
1.3 Project Context . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.3.1 Critique of the Existing System . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.3.2 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.3.3 Existing System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
1.3.4 Proposed Solution Overview . . . . . . . . . . . . . . . . . . . . . . . . . . 6
1.4 Work Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.4.1 Agile Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.4.2 What is SCRUM? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.4.3 Modeling Language . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
1.5 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9

2 Preliminary Study 10
2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2 Requirements Specification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2.1 Identification of actors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2.2 Identification of Requirements . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.3 Project Management with Scrum . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.3.1 Scrum Roles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.3.2 Product Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.3.3 Global Use Case Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.3.4 Sprint Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.4 Work Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.4.1 Software Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
2.5 Project Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
2.5.1 Software Architecture Pattern . . . . . . . . . . . . . . . . . . . . . . . . . 16
2.5.2 General Architecture of the Solution . . . . . . . . . . . . . . . . . . . . . . 16


2.6 Deployment Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17


2.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

3 Sprint 1 – Compliance Verification 18


3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.2 Sprint 1 Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.3 Functional Specification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.3.1 Use Case Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.3.2 Textual Description of "Compliance Verification" . . . . . . . . . . . . . . . 21
3.3.3 Textual Description of "Upload Compliant Report" . . . . . . . . . . . . . . 21
3.4 Sequence diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
3.5 Conception . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
3.5.1 Class Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
3.5.2 Interaction Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
3.6 Execution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
3.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28

4 Sprint 2 – Access Rights and account Management 29


4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
4.2 Sprint 2 Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
4.3 Functional Specification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
4.3.1 Use Case Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
4.3.2 Textual Description of " Authentication " . . . . . . . . . . . . . . . . . . . 32
4.3.3 Textual Description of " Manage user accounts and roles " . . . . . . . . . . 34
4.4 Sequence Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
4.5 Conception . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
4.5.1 Class Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
4.5.2 Interaction Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
4.6 Execution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
4.6.1 Login and Dashboard Access . . . . . . . . . . . . . . . . . . . . . . . . . . 36
4.6.2 Users Management Dashboard . . . . . . . . . . . . . . . . . . . . . . . . . 37
4.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

5 Sprint 3 – Data extraction 39


5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
5.2 Sprint 3 Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
5.3 Functional Specification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
5.3.1 Use Case Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
5.4 Sequence Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
5.5 Conception . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
5.5.1 Class Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
5.5.2 Interaction Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46



5.6 Execution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
5.6.1 Data Extraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
5.6.2 Data Validation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
5.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48

6 Sprint 4 – Data Analysis and dashboard 49


6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
6.2 Sprint 4 Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
6.3 Functional Specification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
6.3.1 Use Case Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
6.4 Sequence Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
6.5 Conception . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
6.5.1 Class Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
6.5.2 Interaction Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
6.6 Execution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
6.6.1 Statistics Calculation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
6.6.2 Statistics Calculation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
6.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60

7 General Conclusion 61

List of Figures

1.1 Scrum . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
1.2 UML Logo . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8

2.1 Global Use Case Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15

3.1 Use Case Diagram - Sprint 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20


3.2 Sequence Diagram - Sprint 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
3.3 Class Diagram - Sprint 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
3.4 Interaction Diagram - Sprint 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
3.5 Step 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
3.6 Step 2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
3.7 Step 3 (non-compliant report) . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
3.8 Step 3 (compliant report) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
3.9 Step 4 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
3.10 Step 5 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28

4.1 Use Case Diagram - Sprint 2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
4.2 Sequence Diagram - Sprint 2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
4.3 Class Diagram - Sprint 2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
4.4 Interaction Diagram - Sprint 2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
4.5 Login Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
4.6 Admin Login . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
4.7 Admin Dashboard . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37

5.1 Use Case Diagram – Sprint 3 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41


5.2 Sequence Diagram - Sprint 3 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
5.3 Class Diagram - Sprint 3 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
5.4 Interaction Diagram - Sprint 3 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
5.5 Login Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
5.6 Admin Dashboard . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48

6.1 Use Case Diagram – Sprint 4 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51


6.2 Sequence Diagram - Sprint 4 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
6.3 Class Diagram - Sprint 4 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
6.4 Interaction Diagram - Sprint 4 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
6.5 Login Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
6.6 Admin Dashboard . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60

List of Tables

2.1 Actors and their Roles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11


2.2 Project Roles and Responsibilities . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.3 Product backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

3.2 Textual Description of "Compliance Verification" . . . . . . . . . . . . . . . . . . . 21


3.3 Textual Description of "Upload Compliant Report" . . . . . . . . . . . . . . . . . . 21

4.2 Use Case: Authentication (Login) . . . . . . . . . . . . . . . . . . . . . . . . . . . 32


4.3 Use Case: Manage user accounts and roles . . . . . . . . . . . . . . . . . . . . . . . 34

5.2 Use Case: Data Extraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41


5.4 Use Case: Database Population . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44

6.2 Use Case: Generate Dashboard . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52


6.3 Use Case: View Audit History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
6.4 Use Case: Create Indicator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54


6.5 Use Case: Generate Statistics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55


6.6 Use Case: Generate Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56



BIBLIOGRAPHY

[1] National Cybersecurity Agency. Presentation of the ANCS. https://www.ancs.tn, 2023. Accessed May 8, 2025.

GENERAL INTRODUCTION

In a context where digital technologies are evolving at an unprecedented pace, ensuring the security
of information systems has become one of the most pressing challenges for organizations across all
sectors. The growing complexity and frequency of cyber threats place increasing pressure on public
institutions to adopt proactive and intelligent cybersecurity strategies. In Tunisia, the National Cy-
bersecurity Agency (ANCS) plays a central role in safeguarding national digital infrastructure and
promoting compliance with cybersecurity regulations among both public and private entities. One of
the core activities undertaken by the agency is the supervision and evaluation of audit reports submit-
ted by organizations as part of cybersecurity assessments. These reports are crucial for determining
the extent to which an institution complies with security standards and best practices. However, the
current processes used to handle and analyze these reports are largely manual, leading to significant
inefficiencies, delays, and a heightened risk of human error. As the volume of audit reports contin-
ues to grow and the need for timely and accurate analysis becomes more urgent, it is essential to
modernize these workflows by introducing automation and intelligent data management tools.

This end of study project addresses these issues by proposing the design and incremental develop-
ment of an automated system dedicated to optimizing the management of cybersecurity audit reports.
The overall objective is to create a scalable and secure platform that will enhance ANCS’s ability
to process audit data efficiently, enforce compliance verification, and provide real-time analytical in-
sights. To ensure a structured and iterative approach, the project has been divided into four main
development sprints, each focusing on a distinct functional domain. The first sprint, titled Compli-
ance Verification, is concerned with enabling users regardless of authentication status to upload audit
reports and receive automated feedback on whether the structure conform to predefined regulatory
requirements. This step is essential for streamlining the initial validation phase and reducing the
workload of agency staff. The second sprint, Access Rights and Account Management, introduces
a secure authentication system and a framework for managing user roles and permissions. This en-
sures that sensitive operations within the platform are accessible only to authorized users, such as
administrators, Audit Supervisors, and study officers, in accordance with their responsibilities.


Building upon this foundation, the third sprint, Data Extraction, focuses on automating the extraction
of structured data from audit reports, enabling validation by Study Officers, and securely storing the
data and notes in a database. Automating this process reduces the risk of error and
facilitates faster access to essential cybersecurity metrics. Finally, the fourth sprint, Data Analysis
and Dashboard, aims to transform the collected data into meaningful insights by offering an interface
for real-time visualization and reporting. Through dynamic dashboards and customizable statistics,
decision makers will be able to track security trends, identify recurring issues, and support strategic
interventions more effectively. The entire project is guided by the Agile Scrum methodology, which
allows for flexibility, user feedback integration, and incremental delivery of functionalities. This
approach is particularly well-suited to complex system development, as it encourages continuous
improvement and responsiveness to change.

The present report provides a comprehensive overview of the project’s context, objectives, and
expected contributions. It begins with a detailed analysis of the limitations of the current manual sys-
tem, followed by a justification for automation and a description of the proposed solution. Subsequent
sections describe the methodology adopted, the roles of different system actors, and the specifications
for each sprint. By the end of this project, the developed platform is expected to significantly enhance
the efficiency and reliability of audit report management at ANCS, contributing to a stronger and
more agile national cybersecurity posture.



CHAPTER 1

PROJECT STUDY

Contents
1.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2 Organization Presentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2.1 ANCS Identity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2.2 Mission of ANCS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2.3 Audit Directorate of Information Systems Security . . . . . . . . . . . . . 5
1.3 Project Context . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.3.1 Critique of the Existing System . . . . . . . . . . . . . . . . . . . . . . . 5
1.3.2 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.3.3 Existing System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
1.3.4 Proposed Solution Overview . . . . . . . . . . . . . . . . . . . . . . . . 6
1.4 Work Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.4.1 Agile Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.4.2 What is SCRUM? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.4.3 Modeling Language . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
1.5 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9


1.1 Introduction

In an era where cybersecurity is a national priority, the effective handling of audit reports is essential to
maintaining the integrity of digital infrastructures. These reports serve as key tools for assessing com-
pliance and identifying vulnerabilities across various organizations. However, traditional methods of
managing them, which are often manual and fragmented, struggle to keep pace with the volume and
complexity of modern data.

This project addresses these limitations by developing a web-based system tailored for the Na-
tional Cybersecurity Agency (ANCS), aimed at optimizing the analysis of audit reports. The system
is designed to automate critical tasks such as compliance checking, data extraction, and the generation
of dashboards, all while ensuring secure user access. This chapter introduces the host organization
and its missions, explains the challenges of the existing system, and presents the proposed solution
along with the methodology chosen to guide its development.

1.2 Organization Presentation

This part of the chapter introduces the National Cybersecurity Agency (ANCS), focusing on its in-
stitutional profile, its main cybersecurity missions, and the specialized duties of the department in
charge of auditing information systems.

1.2.1 ANCS Identity

[1] The National Cybersecurity Agency (ANCS) is the Tunisian authority in charge of safeguarding
information and communication systems at the national level. Created under Decree-Law No. 2023-
17 of March 11, 2023, the agency is entrusted with ensuring digital sovereignty, protecting critical
infrastructure, and supporting the secure digital transformation of both public and private entities.

1.2.2 Mission of ANCS

[1] The main missions of ANCS include:

• Enforcing cybersecurity regulations and compliance across sectors.

• Conducting security audits and evaluating risks in information systems.

• Managing and coordinating responses to cybersecurity incidents.

• Promoting cybersecurity culture through awareness and capacity building.

• Developing national cybersecurity strategies and public policy recommendations.


1.2.3 Audit Directorate of Information Systems Security

The Directorate of Information Systems Security Audit within ANCS is responsible for a range of
critical tasks aimed at enhancing the security of information systems. These include:

• Planning and carrying out cybersecurity audits across various sectors.

• Identifying security gaps and evaluating cyber risks.

• Compiling and delivering reports on compliance and security assessments.

• Suggesting improvements for the continuous enhancement of IT security protocols.

1.3 Project Context

In this section, we examine the current challenges faced by the National Cybersecurity Agency
(ANCS) in managing audit reports, critique the existing manual system, and propose a solution lever-
aging advanced technologies to enhance efficiency and accuracy.

1.3.1 Critique of the Existing System

• Manual and Error-Prone: The manual review process introduces human error, such as over-
looking key details or misclassifying data.

• Lack of Automation: Automation could significantly speed up data processing and ensure con-
sistency. Without automated tools, experts spend too much time on repetitive tasks, rather than
focusing on the analysis and insights.

• Limited Real-Time Monitoring: Without centralized dashboards, the progress of audits and the
results of previous ones are not easily accessible. This hinders tracking and makes it difficult to
derive insights at a glance.

• Limited Statistical Analysis: The current system lacks the capability to perform complex anal-
yses and generate actionable insights, such as cross-organizational comparisons, sector-level
analysis, or tracking security improvements over time.

1.3.2 Problem

The ANCS currently manages audit reports through a manual process that heavily depends on human
oversight. These reports encompass both structured data (e.g., tables) and unstructured data (e.g.,
text, images), making data extraction and analysis complex and time-consuming. Auditors are re-
quired to manually compare these reports against predefined standards set by the ANCS, leading to
inefficiencies and a higher likelihood of errors.

1.3.3 Existing System

Currently, audit reports are managed manually, relying heavily on human oversight. The reports con-
tain both structured (e.g., tables) and unstructured (e.g., text, images) information, making the extrac-
tion of data a complex task. Audit teams must manually compare reports with predefined standards
(the ANCS model), which is inefficient and prone to errors.

1.3.4 Proposed Solution Overview

The proposed solution aims to modernize and automate the audit report analysis process for ANCS,
addressing the inefficiencies, inconsistencies, and errors of the current manual system. By imple-
menting an automated audit report verification system, this solution will enhance both the accuracy
and efficiency of the audit process. It will take the form of a web application that centralizes the entire
audit analysis workflow.

Objectives and Goals

Automated Compliance Verification: One of the primary goals is to automate the process of verifying whether audit reports comply with ANCS's defined standards. This will allow auditors to spend less time on manual checks and focus more on analyzing the content of the reports. Compliance verification will cover both the structure and content of the reports, ensuring that all required sections (e.g., organization description, security indicators) are present and adhere to predefined formats.
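As an illustration, a minimal Python sketch of such a structural check might look as follows; the required section titles and the report_text input are assumptions standing in for the ANCS model and for a prior PDF text-extraction step.

import re

# Hypothetical section titles derived from the ANCS report model.
REQUIRED_SECTIONS = [
    "Organization Description",
    "Security Indicators",
    "Compliance Status",
]

def check_structure(report_text: str) -> dict:
    """Report which required sections are missing from the extracted report text."""
    missing = [
        title for title in REQUIRED_SECTIONS
        if not re.search(re.escape(title), report_text, re.IGNORECASE)
    ]
    return {"compliant": not missing, "missing_sections": missing}

A report would then be flagged as Not Compliant whenever the returned list of missing sections is non-empty.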

Structured Data Extraction: Extracting relevant information from audit reports is currently a time-consuming task prone to human error. The new automated system will efficiently identify and structure key data points, such as an organization's security maturity level, the description of its IT infrastructure, and its compliance status with regulatory requirements. This will reduce processing time and minimize errors.

Data Analysis and Synthesis: The system will include advanced data analysis capabilities, allowing auditors and decision-makers to quickly identify trends, patterns, and anomalies within audit reports. Instead of manually reviewing large volumes of documents, the platform will centralize information and provide automated statistical analysis. This approach is essential for detecting recurring vulnerabilities and guiding strategic cybersecurity decisions.

Real-Time Reporting and Dashboards: With real-time data integration, the system will generate dynamic dashboards and automated reports, offering instant access to audit findings. These dashboards will be customizable, enabling users to focus on specific metrics, such as security levels across different organizations or the frequency of specific compliance issues. This feature will facilitate decision-making, real-time audit monitoring, and a faster response to emerging risks.
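As a hedged example of the kind of metric such a dashboard could expose, the sketch below computes a compliance rate per organization with SQLAlchemy; the AuditReport model and its columns are illustrative assumptions, not the project's actual schema.

from sqlalchemy import func, cast, Integer

from models import AuditReport  # hypothetical model with organization (str) and compliant (bool) columns

def compliance_rate_by_organization(session):
    """Average share of compliant reports per organization."""
    rows = (
        session.query(
            AuditReport.organization,
            func.avg(cast(AuditReport.compliant, Integer)).label("rate"),
        )
        .group_by(AuditReport.organization)
        .all()
    )
    return {organization: float(rate) for organization, rate in rows}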

User Management and Data Security: The system will include advanced user management features and robust security protocols, ensuring that only authorized personnel can access sensitive audit data and reports. Specific roles (e.g., auditors, managers, administrators) will be assigned tailored access levels, ensuring an efficient workflow while maintaining data security.

1.4 Work Methodology

In this section, we outline the work methodology adopted for our project. We have chosen an object-
oriented design approach to ensure efficient organization and to simplify complex tasks. Within this
framework, we have implemented the SCRUM methodology, a well-established Agile approach.

1.4.1 Agile Methods

Among various project management methodologies, we have selected the Agile methodology due to
its professional advantages in project steering. Agile divides projects into smaller phases and guides
teams through cycles of planning, implementation, and evaluation. This approach offers several ben-
efits, including:

• Rapid Progression: Agile’s iterative nature allows for quick adjustments and continuous deliv-
ery of project components.

• Alignment of Clients and Stakeholders: Regular feedback loops ensure that the project remains
aligned with client expectations and stakeholder requirements.

• Continuous Improvement: Frequent evaluations promote ongoing enhancements and refine-


ments throughout the project lifecycle.

1.4.2 What is SCRUM?

SCRUM is an empirical process where decisions are based on observation, experience, and experi-
mentation. It is founded on three pillars: transparency, inspection, and adaptation, fostering an itera-
tive workflow. Empiricism involves conducting small experiments, learning from them, and adapting
actions and methods as needed.

The SCRUM team cultivates trust, essential for preventing tensions and obstacles. Values such
as courage, focus, commitment, respect, and openness guide collaboration and are crucial in environ-
ments centered on experimentation.


Key characteristics of SCRUM include:

• Maximizing Value While Minimizing Costs: Prioritizing tasks to deliver the most value effi-
ciently.

• Product Backlog Managed by a Single Individual: A prioritized list of requirements overseen


by the Product Owner.

• Self-Organizing Teams with Development Cycles of 1 to 4 Weeks (Sprints): Teams plan and
execute work in short, iterative cycles.

• Daily Meetings to Synchronize Work (Daily Stand-ups): Brief sessions to align team members
and address any impediments.

Figure 1.1: Scrum

1.4.3 Modeling Language

Graphical modeling languages, such as the Unified Modeling Language (UML), utilize diagrams to
represent concepts and their relationships. We have selected UML for our project to leverage its
visual clarity and standardization, facilitating communication among developers and ensuring precise
system design.

Figure 1.2: UML Logo


1.5 Conclusion

This chapter has provided an overview of the general context and the institution hosting the project, while exploring the fundamental concepts necessary for its understanding. In the next chapter, we will discuss the analysis of the functional and non-functional needs of the project, as well as the establishment of the provisional schedule.



CHAPTER 2

PRELIMINARY STUDY

Contents
2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2 Requirements Specification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2.1 Identification of actors . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2.2 Identification of Requirements . . . . . . . . . . . . . . . . . . . . . . . 11
2.3 Project Management with Scrum . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.3.1 Scrum Roles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.3.2 Product Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.3.3 Global Use Case Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.3.4 Sprint Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.4 Work Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.4.1 Software Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
2.5 Project Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
2.5.1 Software Architecture Pattern . . . . . . . . . . . . . . . . . . . . . . . 16
2.5.2 General Architecture of the Solution . . . . . . . . . . . . . . . . . . . . 16
2.6 Deployment Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
2.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17


2.1 Introduction

This chapter presents the initiation phase, which represents the first step in the realization of our
project. We begin with a preliminary analysis, identifying key actors and listing the system’s func-
tional and non-functional requirements, following the methodology previously chosen in the pre-
ceding chapter. Next, we delve into project planning, highlighting the importance of the initiation
phase within the Scrum methodology we have adopted. This initial phase is a crucial period of
exploration, organization, and definition of functionalities through the establishment of the Product
Backlog. Finally, we provide a synthesis of this initial phase, reaffirming our commitment to a precise
methodological approach and detailing the technological environment chosen to support our work.

2.2 Requirements Specification

This section specifies the requirements of the system: we first identify the actors that interact with it, then list the functional and non-functional requirements it must satisfy.

2.2.1 Identification of actors

An actor refers to an entity, whether a person or another system, that interacts with the system by
exchanging information (input and/or output).

Table 2.1: Actors and their Roles

Actor Roles
Administrator Manages user accounts, roles, and system configurations
Non-authenticated user Uploads audit reports and verifies compliance with the ANCS model.
Study Officer Reviews and validates compliance analysis results.
Audit Supervisor Oversees the audit process and compliance verification.
Decision Maker Uses compliance reports to make policy or security decisions.

2.2.2 Identification of Requirements

During the analysis and requirements specification phase, our goal is to gain an in-depth understand-
ing of all the components of our project while carefully identifying and examining potential vulner-
abilities. This step allows us to precisely define both functional and non-functional requirements,
which helps us structure our work effectively.


Functional Requirements

The objective of our project is to address a specific need, which requires a clear articulation of this
need before proposing a solution. Functional requirements define the features and actions that the
system must be able to execute to meet user or client expectations. These requirements are identified
by determining the tasks and functions necessary for the system to fulfill its primary purpose. They are
crucial in ensuring that the system effectively and reliably meets user expectations. After analyzing
the existing system, we have identified the following functional requirements:

• Verify the compliance of uploaded audit reports with the ANCS model and report missing sections.

• Authenticate users and manage accounts, roles, and access rights.

• Extract structured data from audit reports, allow its validation, and store it in the database.

• Generate indicators, statistics, and dashboards from the stored audit data, with export to PDF or CSV.

Non-Functional Requirements

The specification of non-functional requirements aims to identify the constraints and characteristics
that the system must adhere to, without describing specific functionalities. These requirements are not
tied to a particular use case but define the overall system's quality attributes. The key non-functional
requirements include:

• Security: sensitive audit data must only be accessible to authenticated users with the appropriate role.

• Performance: compliance checking and data extraction must be fast enough to replace the manual process.

• Scalability: the architecture must support a growing volume of reports and users.

• Usability: the web interface must remain simple and intuitive for all user profiles.

2.3 Project Management with Scrum

2.3.1 Scrum Roles

In this section, we introduce the different stakeholders involved in the various stages of the project
and the development of the internship report. For our project, the development team consists of two
members responsible for carrying out the project from conception to implementation. The key roles
in the Scrum methodology include:

• Product Owner: Acts as the representative of the users, defining requirements, prioritizing fea-
tures, and overseeing the development team to ensure the product aligns with user expectations.

• Scrum Master: Facilitates the project workflow, ensuring the Scrum framework is followed,
removing obstacles, and fostering a productive and collaborative atmosphere within the team

• Development Team: Responsible for the actual implementation of the project, handling coding,
testing, and deployment based on the defined backlog and sprint planning.

2.3.2 Product Backlog

The product backlog is a prioritized and continuously updated list of features, enhancements, and
tasks required for a product. This essential tool for agile teams helps organize work, focus on critical
elements, and communicate effectively with stakeholders


Table 2.2: Project Roles and Responsibilities

Role | Responsibility | Actor
Development Team | Design; Development; Testing and validation | Aymen Gasri, Sinda Ben Said
Product Owner | Define requirements; Prioritize features; Validate deliverables | Haythem Slimane
Scrum Master | Ensure adherence to Scrum practices; Shield the team from external interferences; Facilitate meetings and communication | Sana Ghanney

Table 2.3: Product backlog

Feature | User Story | User Role | Priority
Submit and Check Compliance of a Report | As a non-authenticated user, I want to submit a report and check its compliance with the ANCS model, so that I can determine if the report meets the standards. | Non-authenticated User | High
Login | As an ANCS staff member, I want to log in to the system with my credentials so that I can access my functionalities. | Administrator, Decision Maker, Audit Supervisor, Study Officer | High
Access Rights | As an Administrator, I want to create, delete, and assign roles to users so that each person accesses only the appropriate modules. | Administrator | High
Extract Data from Audit Reports | As a Study Officer, I want to extract data from audit reports through the management interface, so that I can store the data in the database for further analysis. | Study Officer | High
Create Indicators | As a Supervisor or Decision Maker, I want to create simple and complex indicators from audit data to assess trends. | Supervisor or Decision Maker | High
Dashboard for data analysis | As a Decision Maker, I want to generate dashboards and export audit insights in PDF or CSV formats for communication. | Decision Maker | High


2.3.3 Global Use Case Diagram

The various use cases of our application illustrate the overall functionalities offered by our system, as shown in Figure 2.1.

Figure 2.1: Global Use Case Diagram

2.3.4 Sprint Planning

• Sprint 1: Compliance Verification

• Sprint 2: Access Rights and account Management

• Sprint 3: Data Extraction

• Sprint 4: Data Analysis and Dashboard

2.4 Work Environment

The choice of the work environment plays a crucial role in the project’s success. It includes the
software, frameworks, and development tools necessary for implementing the solution.


2.4.1 Software Environment

The software environment used for the development of the solution is as follows:

• Operating System: Windows 10/11 and Linux (Ubuntu 22.04)

• Backend: Python with the Flask framework (REST API)

• Database: PostgreSQL

• Frontend: JavaScript/TypeScript with React.js and CSS for the design

• Authentication: JWT (JSON Web Token) for secure session management

• Version Control: Git and GitHub for source code management

• Development Tools: Visual Studio Code (VS Code) / PyCharm for coding, Postman for API testing, Docker for service containerization

• Project Management Methodology: Scrum, with tracking via Jira and Trello

2.5 Project Architecture

2.5.1 Software Architecture Pattern

The solution follows an MVC (Model-View-Controller) architecture, suitable for modern web applications:

• Model: manages data and interacts with the database.

• View: a React.js user interface, ensuring a smooth experience.

• Controller: a REST API built with Flask, handling communication between the frontend and backend.

Additionally, the architecture follows a microservices approach, enhancing scalability and maintainability.
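A minimal sketch of how this split can look on the backend side is given below; the model, table, and route names are illustrative assumptions rather than the project's actual code, and the React view would simply consume the JSON returned by the controller.

from flask import Flask, jsonify
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "postgresql://user:password@localhost/ancs_audits"
db = SQLAlchemy(app)

# Model: persisted audit report metadata.
class AuditReport(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    organization = db.Column(db.String(120))
    compliant = db.Column(db.Boolean, default=False)

# Controller: REST endpoint consumed by the React view.
@app.route("/api/reports", methods=["GET"])
def list_reports():
    reports = AuditReport.query.all()
    return jsonify([
        {"id": r.id, "organization": r.organization, "compliant": r.compliant}
        for r in reports
    ])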

2.5.2 General Architecture of the Solution

The general architecture of the solution consists of multiple interconnected components:

• Web Interface (Frontend): developed with React.js, allowing users to interact with the system via an intuitive interface.

• REST API (Backend): built with Flask, it processes frontend requests and manages database access.

• Database (PostgreSQL): stores audit reports, users, and analysis results.

• Verification Engine: performs automated audit report processing.

• Authentication and User Management: secured via JWT and role-based access (administrators, auditors, decision-makers).

• Dynamic Dashboards: generated from analysis results, enabling real-time decision-making.

2.6 Deployment Diagram

The deployment diagram illustrates the technical infrastructure of the application. It is based on a
Dockerized deployment, ensuring increased flexibility and optimal scalability.
Deployment Infrastructure:

• Backend Server: Flask + REST API container

• Frontend Server: React.js container

• Database: PostgreSQL with secured access management

• Storage: file server for audit reports

• Authentication: JWT middleware for secure sessions

• Hosting: deployment on a cloud server (AWS or DigitalOcean)

2.7 Conclusion

This chapter has outlined the work environment and system architecture. The adoption of an MVC
architecture based on microservices and the use of Docker ensure better modularity and scalability.
The next chapter will detail the implementation of the first sprint and its impact on project progress.



CHAPTER 3

SPRINT 1 – COMPLIANCE VERIFICATION

Contents
3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.2 Sprint 1 Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.3 Functional Specification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.3.1 Use Case Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.3.2 Textual Description of "Compliance Verification" . . . . . . . . . . . . . 21
3.3.3 Textual Description of "Upload Compliant Report" . . . . . . . . . . . . 21
3.4 Sequence diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
3.5 Conception . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
3.5.1 Class Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
3.5.2 Interaction Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
3.6 Execution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
3.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28


3.1 Introduction

In this chapter, we present the first version of our application, which includes the first sprint entitled
"Verification of Audit Report Compliance with the ANCS Model". This sprint focuses on implement-
ing the basic functionality that allows both authenticated and non-authenticated users to interact with
the system. The sprint includes user authentication, report uploading, compliance verification, and
displaying results. We will detail the sprint backlog, perform use case analysis, and include UML
diagrams, followed by a demonstration of the implemented interface components.

3.2 Sprint 1 Backlog

The table below provides a detailed overview of the sprint backlog for the compliance verification
module.

Id | Feature | User Story | Duration (Days)
1 | Compliance Verification | As a Guest (non-authenticated user), I want to upload an audit report so I can verify its compliance with standard rules. | 2
2 | Upload Compliant Report | As a Guest, after a successful compliance check, I want to upload the compliant report along with credentials. | 3

3.3 Functional Specification

This section introduces the analysis phase, which answers the question of what the system must do. The answer lies in the use case diagram presented below and the textual descriptions that accompany it.

3.3.1 Use Case Diagram

This subsection presents the refinement of each feature use case with the textual description.

Refinement of "Compliance Verification" Use Case

Figure 3.1 shows the refinement of the "Check Compliance" use case.


Figure 3.1: Use Case Diagram - Sprint 1


3.3.2 Textual Description of "Compliance Verification"

Title: Compliance Verification
Actors: Guest
Pre-condition: Guest has an audit report to upload.
Main Scenario:
1. Guest navigates to the compliance check interface.
2. Guest selects the report.
3. The system analyzes the structure and content of the report.
4. The system returns a result: Compliant or Not Compliant.
Exceptions:
• If the report is missing required sections, the system reports it.
Post-condition: Guest receives the compliance status. If compliant, the user may proceed to upload the report.

Table 3.2: Textual Description of "Compliance Verification"
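The main scenario above could be served by a single backend endpoint along the following lines; the route name, the pdfplumber dependency, and the check_structure helper (the structural check sketched in Chapter 1) are assumptions used only to illustrate the flow, not the project's actual code.

from flask import Flask, request, jsonify
import pdfplumber

from compliance import check_structure  # hypothetical module holding the structural check

app = Flask(__name__)

@app.route("/api/check-compliance", methods=["POST"])
def check_compliance():
    uploaded = request.files.get("report")
    if uploaded is None:
        return jsonify({"error": "No report file provided"}), 400

    # Extract the raw text of every page of the uploaded PDF.
    with pdfplumber.open(uploaded) as pdf:
        text = "\n".join(page.extract_text() or "" for page in pdf.pages)

    result = check_structure(text)
    status = "Compliant" if result["compliant"] else "Not Compliant"
    return jsonify({"status": status, "missing_sections": result["missing_sections"]})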

3.3.3 Textual Description of "Upload Compliant Report"

Title: Upload Compliant Report
Actors: Guest
Pre-condition: Audit report has been validated as compliant.
Main Scenario:
1. The Guest enters basic credentials (e.g., email, organization).
2. The Guest uploads the report.
3. The system stores the report securely.
Exceptions:
• Missing credentials prompt an error message.
• Upload fails due to connectivity issues.
Post-condition: Report is successfully uploaded.

Table 3.3: Textual Description of "Upload Compliant Report"


3.4 Sequence diagram

Figure 3.2 represents the "Check Compliance" sequence diagram.

Figure 3.2: Sequence Diagram - Sprint 1

3.5 Conception

3.5.1 Class Diagram

In this section, we present the class diagram for this sprint.


Figure 3.3: Class Diagram - Sprint 1

3.5.2 Interaction Diagram


Figure 3.4: Interaction Diagram - Sprint 1

3.6 Execution

This section is dedicated to showing screenshots of the completed work of this sprint. In the Check Compliance interface, the guest user is invited to upload a report file for compliance verification, as shown in the steps below.


Step 1: Access the Compliance Verification Interface

The user opens the application as a Guest (i.e., without logging in). The homepage provides an interface that allows uploading an audit report for compliance verification.

Figure 3.5: Step 1


Step 2: Upload an Audit Report

The Guest selects and uploads a PDF file through the form. A request is sent to the backend for compliance analysis.

Figure 3.6: Step 2

Step 3: Receive Compliance Result

Once the backend processes the report, the system displays whether it is compliant or not. If non-compliant, an error message is shown and the process ends.

Figure 3.7: Step 3 (non-compliant report)

If the report is compliant:


Figure 3.8: Step 3 (compliant report)

Step 4: Upload Report with Credentials

If the report is compliant, the Guest is prompted to upload it again and provide personal identification credentials (e.g., name, email, etc.).

Figure 3.9: Step 4

Step 5: Final Confirmation

The backend validates the credentials and stores the report. A confirmation message is displayed indicating successful upload.


Figure 3.10: Step 5

3.7 Conclusion

At the end of this first sprint, we successfully completed the design and implementation of the compli-
ance verification module. At this stage, a Guest user can access the platform without authentication,
upload an audit report, and automatically verify its compliance with predefined rules. If the report is
compliant, the guest is then able to submit it along with their personal credentials for secure storage.
This sprint laid the foundation for the core functionality of the platform and ensured a seamless and
secure interaction between the guest and the compliance verification engine.



CHAPTER 4

SPRINT 2 – ACCESS RIGHTS AND ACCOUNT MANAGEMENT

Contents
4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
4.2 Sprint 2 Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
4.3 Functional Specification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
4.3.1 Use Case Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
4.3.2 Textual Description of " Authentication " . . . . . . . . . . . . . . . . . . 32
4.3.3 Textual Description of " Manage user accounts and roles " . . . . . . . . 34
4.4 Sequence Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
4.5 Conception . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
4.5.1 Class Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
4.5.2 Interaction Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
4.6 Execution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
4.6.1 Login and Dashboard Access . . . . . . . . . . . . . . . . . . . . . . . . 36
4.6.2 Users Management Dashboard . . . . . . . . . . . . . . . . . . . . . . . 37
4.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38


4.1 Introduction

This sprint focuses on implementing access rights within the application. As multiple user roles
interact with the system, it is important to control who can access which features. The goal is to
ensure that each user—whether a Guest, Study Officer, Certified Auditor, or ANCS Staff—can only
perform actions relevant to their role. In this sprint, we developed user authentication and role-based
access management. This helps secure the platform and ensures a clear separation of responsibilities.
The following sections detail the sprint backlog, use case analysis, UML diagrams, and interface
demonstrations.

4.2 Sprint 2 Backlog

The following table outlines the features selected for Sprint 2, which is dedicated to implementing
user access management and role-based permissions. Each item is presented with its user story and
estimated development time.

Id | Feature | User Story | Duration (Days)
1 | User Authentication | As a user, I want to log in and log out securely so that I can access my account and perform role-specific actions. | 2
2 | Role Assignment | As an administrator, I want to assign a role to each user so that access rights are correctly managed. | 2
3 | Permissions Management | As a user, I should only be able to access features that match my role to prevent unauthorized actions. | 3

4.3 Functional Specification

4.3.1 Use Case Diagram

The figure below illustrates the different use cases related to access control.


Figure 4.1: Use Case Diagram - Sprint 2


4.3.2 Textual Description of " Authentication "

Table 4.2: Use Case: Authentication (Login)

Use Case ID: UC1
Use Case Name: Authentication (Login)
Actor(s): Administrator, Decision Maker, Study Officer, Audit Supervisor
Description: This use case allows registered users to securely access the system by providing valid credentials. Upon successful authentication, users are redirected to the dashboard or module corresponding to their assigned role.
Preconditions: The user must already have a valid account in the system.
Postconditions: The user is authenticated and has access to modules and data authorized for their role.
Trigger: The user enters their username and password on the login interface and clicks "Login".
Main Flow:
1. The user navigates to the login page.
2. The system prompts for username and password.
3. The user submits credentials.
4. The system verifies credentials.
5. If valid, the system redirects to the appropriate module/dashboard.
Alternative Flows:
AF1: Invalid Credentials
• System detects incorrect credentials.
• Error message is displayed.
• User remains on the login page.
AF2: Account Locked
• After multiple failed attempts, the account is locked.
• User is notified to contact the administrator.
Security Considerations:
• Passwords are encrypted in the database.
• Login attempts are logged.
• Sessions expire after inactivity (e.g., 15 minutes).
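A hedged sketch of this login flow, using Flask and the JWT-based session management chosen in Chapter 2, is given below; the User model, its field names, and the route are illustrative assumptions, not the project's actual code.

from datetime import timedelta

from flask import Flask, request, jsonify
from flask_jwt_extended import JWTManager, create_access_token
from werkzeug.security import check_password_hash

from models import User  # hypothetical model with username, password_hash, and role columns

app = Flask(__name__)
app.config["JWT_SECRET_KEY"] = "change-me"  # loaded from the environment in practice
app.config["JWT_ACCESS_TOKEN_EXPIRES"] = timedelta(minutes=15)  # tokens expire after 15 minutes
jwt = JWTManager(app)

@app.route("/api/login", methods=["POST"])
def login():
    data = request.get_json()
    user = User.query.filter_by(username=data.get("username")).first()
    if user is None or not check_password_hash(user.password_hash, data.get("password", "")):
        return jsonify({"error": "Invalid credentials"}), 401  # AF1: invalid credentials
    token = create_access_token(identity=user.username, additional_claims={"role": user.role})
    return jsonify({"access_token": token, "role": user.role})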


4.3.3 Textual Description of " Manage user accounts and roles "

Table 4.3: Use Case: Manage user accounts and roles

Use Case ID: UC2
Use Case Name: Manage user accounts and roles
Actor(s): Administrator
Description: This use case allows the Administrator to create, update, and delete user accounts. It also includes assigning appropriate roles to each user, which define their access permissions within the system.
Preconditions: The administrator must be authenticated and have access to the user management interface.
Postconditions: User accounts are created, updated, or removed as needed. Each user is assigned a role that governs their system access.
Trigger: The administrator logs into the system and, from the dashboard interface (admin homepage), selects the menu or button labeled 'Users' to start creating, modifying, or deleting user accounts.
Main Flow:
1. The Administrator accesses the "Manage Users" interface.
2. Selects the action: Create, Update, or Delete a user.
3. For creation: enters user information (name, email, username, password).
4. Assigns a role to the user (e.g., Certified Auditor, Study Officer, ANCS Staff).
5. Confirms the operation.
6. The system stores the changes and confirms success.
Alternative Flows:
AF1: Duplicate Username
• System detects that the username already exists.
• Error message prompts for a different username.
AF2: Role Assignment Error
• Administrator tries to assign a non-existent role.
• System displays an error and blocks the action.
Security Considerations:
• Only authenticated administrators can access this use case.
• Input validation is enforced on all fields.
• Audit logs are recorded for each change in user data or roles.
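Restrictions of this kind are commonly enforced with a small decorator on the protected routes; the sketch below assumes the JWT login shown earlier stores the user's role as a claim, and the role names are illustrative.

from functools import wraps

from flask import jsonify
from flask_jwt_extended import verify_jwt_in_request, get_jwt

def role_required(*allowed_roles):
    """Reject the request unless the JWT carries one of the allowed roles."""
    def decorator(view):
        @wraps(view)
        def wrapper(*args, **kwargs):
            verify_jwt_in_request()
            claims = get_jwt()
            if claims.get("role") not in allowed_roles:
                return jsonify({"error": "Forbidden"}), 403
            return view(*args, **kwargs)
        return wrapper
    return decorator

# Example: only administrators may manage user accounts.
# @app.route("/api/users", methods=["POST"])
# @role_required("Administrator")
# def create_user():
#     ...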

4.4 Sequence Diagram

Figure 4.2: Sequence Diagram - Sprint 2

4.5 Conception

4.5.1 Class Diagram

Figure 4.3: Class Diagram - Sprint 2

4.5.2 Interaction Diagram


Figure 4.4: Interaction Diagram - Sprint 2

4.6 Execution

Screenshots below illustrate the functionalities developed in this sprint.

4.6.1 Login and Dashboard Access


Figure 4.5: Login Interface

Figure 4.6: Admin Login

4.6.2 Users Management Dashboard

Figure 4.7: Admin Dashboard


4.7 Conclusion

This sprint introduced user authentication and secure role-based access management. By imple-
menting these features, the system ensures that only authorized users can perform certain operations,
thereby strengthening the overall security and integrity of the platform.



CHAPTER 5

SPRINT 3 – DATA EXTRACTION

Contents
5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
5.2 Sprint 3 Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
5.3 Functional Specification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
5.3.1 Use Case Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
5.4 Sequence Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
5.5 Conception . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
5.5.1 Class Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
5.5.2 Interaction Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
5.6 Execution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
5.6.1 Data Extraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
5.6.2 Data Validation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
5.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48


5.1 Introduction

This chapter outlines the implementation of Sprint 3, which focuses on automating the extraction of
structured data from audit reports, enabling validation by Study Officers, and securely storing the data
and notes in a database. The sprint addresses the challenges of manual data entry, such as inefficiency
and human error, by leveraging text extraction tools and structured workflows.
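As an illustration of this extraction step, the sketch below pulls an audit date and a list of findings out of a PDF. It assumes the pdfplumber library and illustrative field patterns; the actual system relies on the predefined structure of ANCS audit reports, so the regular expressions shown here are only examples.

import re

import pdfplumber

# Illustrative patterns; the real report structure defines the exact labels.
DATE_PATTERN = re.compile(r"Audit date\s*:\s*(\d{2}/\d{2}/\d{4})", re.IGNORECASE)
FINDING_PATTERN = re.compile(r"^\s*Finding\s*\d+\s*:\s*(.+)$", re.IGNORECASE | re.MULTILINE)

def extract_report_fields(pdf_path: str) -> dict:
    """Extract the audit date and the list of findings from a structured report."""
    with pdfplumber.open(pdf_path) as pdf:
        text = "\n".join(page.extract_text() or "" for page in pdf.pages)

    date_match = DATE_PATTERN.search(text)
    return {
        "audit_date": date_match.group(1) if date_match else None,
        "findings": FINDING_PATTERN.findall(text),
    }

# Example usage: the extracted dictionary is then displayed in the interface for review.
# fields = extract_report_fields("audit_report.pdf")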

5.2 Sprint 3 Backlog

The following backlog lists the features selected for Sprint 3. Each feature is described with its user story and estimated development time.

1. PDF Data Extraction (3 days)
User story: As a Study Officer, I want the system to automatically extract audit dates and findings from PDFs to avoid manual data entry.

2. Data Validation Interface (2 days)
User story: As a Study Officer, I want to edit and save extracted data via a form, which will update the database.

3. Notes Submission Interface (3 days)
User story: As a Study Officer, I want to submit contextual notes for an audit report, which are stored in the database alongside the extracted data.

4. Database Integration (2 days)
User story: As the System, I need to store extracted data and notes in relational tables to ensure data integrity and enable future queries.

5. Role-Based Access Control (3 days)
User story: As an Audit Supervisor, I want to view audit reports and notes in a secure dashboard without modifying data to maintain compliance.

6. Report Completion Workflow (4 days)
User story: As an Audit Supervisor, I want to click a "Complete Report" button to finalize the audit, locking edits and ensuring both roles can view the combined data and notes.


5.3 Functional Specification

5.3.1 Use Case Diagram

The following diagram illustrates the different use cases related to data extraction and population.

Figure 5.1: Use Case Diagram – Sprint 3

Textual Description of "Data Extraction"

Table 5.2: Use Case: Data Extraction

Use Case ID: UC1
Use Case Name: Data Extraction
Actor(s): Study Officer
Description: This use case allows the Study Officer to extract relevant data from the audit report. The system identifies and extracts data based on the predefined structure of the audit report. The extracted data is then displayed in the interface for the Study Officer to review, modify, or validate.
Preconditions: The Study Officer must be authenticated and logged into the system. An audit report must be available for extraction.
Postconditions: The extracted data is displayed in the user interface, ready for modification and validation.
Trigger: The Study Officer selects an audit report from the management interface and initiates the data extraction process.
Main Flow:
1. The Study Officer selects an audit report for extraction.
2. The system parses the report based on the predefined model structure.
3. The extracted data is displayed in the user interface for review.
4. The Study Officer reviews and makes necessary modifications to the data.
Alternative Flows:
AF1: No Data Found
• If the report does not contain the expected data, the system notifies the Study Officer and highlights the missing or erroneous sections.
Security Considerations:
• The data extraction process ensures that sensitive information is handled securely during the extraction phase.
• Only authorized Study Officers can initiate the data extraction process.

Textual Description of "Data Validation"

Table 5.3: Use Case: Data Validation

Use Case ID: UC2
Use Case Name: Data Validation
Actor(s): Study Officer
Description: This use case allows the Study Officer to validate the extracted data. After reviewing the extracted data, the Study Officer can modify the values if needed and then validate the information to ensure its correctness before storing it in the database. The validation step helps maintain the accuracy of the data entered into the system.
Preconditions: The Study Officer must have completed the data extraction process. The extracted data must be displayed in the interface for review and modification.
Postconditions: The data is validated and ready for storage in the database. If the data is found to be accurate, the Study Officer can proceed to store it in the database.
Trigger: The Study Officer clicks on the "Validate" button after reviewing the extracted data.
Main Flow:
1. The Study Officer reviews the extracted data for accuracy.
2. The Study Officer makes any necessary modifications.
3. The Study Officer clicks "Validate" to confirm the data is accurate.
4. The system marks the data as validated.
Alternative Flows:
AF1: Data Requires Modification
• If the Study Officer identifies errors or discrepancies, they modify the data before validating it.
AF2: Invalid Data
• If the Study Officer finds the data is incorrect and cannot be validated, they can reject it and request further review.
Security Considerations:
• Only authenticated Study Officers can validate the data.
• The validation process ensures that only accurate and complete data is stored in the database.
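The sketch below illustrates the kind of server-side checks that could back the "Validate" action. The field names and rules are assumptions for illustration rather than the project's exact schema.

from datetime import datetime

def validate_extracted_data(fields: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the data can be validated."""
    errors = []

    # The audit date must parse as a real calendar date (DD/MM/YYYY is assumed here).
    try:
        datetime.strptime(fields.get("audit_date", ""), "%d/%m/%Y")
    except ValueError:
        errors.append("audit_date is missing or not in DD/MM/YYYY format")

    # At least one finding is expected before the report can be stored.
    if not fields.get("findings"):
        errors.append("no findings were extracted or entered")

    return errors

# If errors are returned, the Study Officer corrects the form or rejects the data
# (alternative flows AF1/AF2 above); otherwise the record is marked as validated.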


Textual Description of "Database Population"

Table 5.4: Use Case: Database Population

Use Case ID: UC3
Use Case Name: Database Population
Actor(s): System, Study Officer
Description: This use case allows the system to store validated data in the database. Once the data is validated by the Study Officer, it is saved into the database for future analysis and reporting. This ensures that all audit-related data is securely stored and easily retrievable.
Preconditions:
• The data must be validated by the Study Officer.
• The Study Officer must have completed the data validation process.
Postconditions:
• The validated data is stored securely in the database.
• The data is now available for future analysis and reporting.
Trigger: The Study Officer clicks on the "Save" button after validating the data.
Main Flow:
1. The Study Officer clicks the "Save" button after data validation.
2. The system stores the validated data in the database.
3. The system confirms that the data has been successfully saved.
Alternative Flows:
AF1: Database Error
• If an error occurs while saving the data, the system notifies the Study Officer and requests them to try again.
Security Considerations:
• Data is encrypted before being stored in the database to protect sensitive information.
• The system ensures that only authorized users can store data in the database.
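As a sketch of this population step, the following function stores the validated fields and the accompanying notes in a single transaction. It assumes psycopg2 and the PostgreSQL database cited in the conclusion; the table and column names are illustrative.

import json

import psycopg2

def save_validated_report(dsn: str, report: dict) -> int:
    """Store the validated fields and the Study Officer's notes in one transaction."""
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(
                """
                INSERT INTO audit_reports (organization, audit_date, findings, notes)
                VALUES (%s, %s, %s, %s)
                RETURNING id
                """,
                (
                    report["organization"],
                    report["audit_date"],
                    json.dumps(report["findings"]),  # findings list stored as JSON text
                    report.get("notes", ""),
                ),
            )
            new_id = cur.fetchone()[0]
    # Leaving the outer "with" block commits the transaction, or rolls it back if an
    # error occurred (alternative flow AF1 above).
    return new_id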


5.4 Sequence Diagram


Figure 5.2: Sequence Diagram - Sprint 3

5.5 Conception

5.5.1 Class Diagram



Figure 5.3: Class Diagram - Sprint 3

5.5.2 Interaction Diagram



Figure 5.4: Interaction Diagram - Sprint 3

5.6 Execution

Screenshots below illustrate the functionalities developed in this sprint.

5.6.1 Data Extraction


Figure 5.5: Login Interface

5.6.2 Data Validation

Figure 5.6: Admin Dashboard

5.7 Conclusion

Sprint 3 successfully implemented the data extraction and database population module. The system
now extracts key information from audit reports, allowing the Study Officer to review and validate the
data before storing it in the database. This ensures data accuracy and integrity, while also preserving
historical audit data for future reference. The sprint has laid a solid foundation for subsequent data
processing and analysis phases, ensuring reliable and structured data for the next steps in the project.



CHAPTER 6

SPRINT 4 – DATA ANALYSIS AND DASHBOARD

Contents
6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
6.2 Sprint 4 Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
6.3 Functional Specification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
6.3.1 Use Case Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
6.4 Sequence Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
6.5 Conception . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
6.5.1 Class Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
6.5.2 Interaction Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
6.6 Execution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
6.6.1 Statistics Calculation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
6.6.2 Statistics Calculation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
6.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60


6.1 Introduction

This sprint focuses on processing the data extracted from the audit reports and generating relevant
statistics, dashboards, and reports. The goal is to provide visual representations of the audit missions’
data to facilitate decision-making. The module allows the creation of both simple and complex in-
dicators, statistical analysis, and the generation of customized dashboards and reports. This ensures
that stakeholders can easily interpret the results of the audits and compare them across different time
periods and sectors.

6.2 Sprint 4 Backlog

1. Data Visualization (3 days)
User story: As a Study Officer, I want to view the historical audit data for each organization so that I can track and analyze audit results over time.

2. Indicator Design (4 days)
User story: As a Study Officer, I want to design both simple and complex indicators based on the audit data, so that I can generate meaningful insights.

3. Statistics Calculation (5 days)
User story: As a Study Officer, I want to calculate statistics for specific indicators over a chosen period to identify trends and comparisons with other organizations.

4. Dashboard Generation (3 days)
User story: As a Study Officer, I want to generate a dashboard that includes all the calculated statistics and associated graphs, so I can monitor the results easily.

5. Report Generation (4 days)
User story: As a Study Officer, I want to generate reports based on dashboard templates, so that I can share audit insights in PDF or CSV formats.


6.3 Functional Specification

6.3.1 Use Case Diagram

The figure below illustrates the different use cases related to data processing, statistics, and dashboard/report generation.

Figure 6.1: Use Case Diagram – Sprint 4


Textual Description of "Generate Dashboard"

Table 6.2: Use Case: Generate Dashboard

Use Case ID: UC3.1
Use Case Name: Generate Dashboard
Actor(s): Decision Maker
Description: This use case allows the Decision Maker to generate dashboards that display key audit indicators. These dashboards can be based on predefined templates and include data visualizations and aggregated statistics.
Preconditions:
• Relevant audit data must already be stored in the database.
• The user must be authenticated and authorized.
Postconditions:
• A dashboard is generated and displayed.
• The user can interact with or export the dashboard.
Trigger: The Decision Maker selects a dashboard model and clicks "Generate".
Main Flow:
1. The user logs into the system.
2. The user selects a dashboard model.
3. The system queries the database and computes indicators.
4. The dashboard with graphs and stats is generated.
5. The dashboard is displayed or exported.
Alternative Flows:
AF1: No Data Available
• The system notifies the user that no data is available to generate the dashboard.
Security Considerations:
• Only authorized users can generate dashboards.
• Sensitive data in the dashboard is masked or aggregated.
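To illustrate how a generated dashboard could be delivered to the interface, the sketch below exposes chart-ready JSON from the backend. It assumes the Flask backend and React.js frontend mentioned in the conclusion; the route, indicator name, and query function are placeholders.

from flask import Flask, jsonify

app = Flask(__name__)

def query_indicator_series(indicator: str) -> list[dict]:
    # Placeholder for the database query returning aggregated values per year.
    return [{"year": 2022, "value": 12}, {"year": 2023, "value": 9}, {"year": 2024, "value": 7}]

@app.route("/dashboards/compliance")
def compliance_dashboard():
    # Only aggregated values are returned, in line with the masking/aggregation
    # consideration above; access control is assumed to be handled by the login layer.
    return jsonify({
        "title": "Compliance overview",
        "charts": [
            {
                "indicator": "critical_findings",
                "type": "line",
                "series": query_indicator_series("critical_findings"),
            }
        ],
    })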


Textual Description of "View Audit History"

Table 6.3: Use Case: View Audit History

Use Case ID: UC3.1
Use Case Name: View Audit History
Actor(s): Study Officer, Decision Maker
Description: This use case allows the user to visualize the complete history of audit missions for a specific organization, including all structured and validated data from the database.
Preconditions:
• Audit data must be available in the database.
• The user must have appropriate access rights.
Postconditions:
• The audit history is displayed to the user.
Trigger: The user selects an organization and requests to view audit history.
Main Flow:
1. User selects the organization.
2. System retrieves all related audit records.
3. System displays the audit history.
Alternative Flows: None
Security Considerations:
• Access limited to authorized users.
• Data is read-only.


Textual Description of "Create Indicator"

Table 6.4: Use Case: Create Indicator

Use Case ID: UC3.2
Use Case Name: Create Indicator
Actor(s): Study Officer, Decision Maker
Description: This use case allows users to create either a simple or complex indicator based on one or more fields from the audit database, using logical or arithmetic expressions.
Preconditions:
• Database must contain structured data.
• User must be authenticated.
Postconditions:
• The new indicator is saved and can be reused.
Trigger: User chooses to define a new indicator.
Main Flow:
1. User selects fields and logic.
2. System validates the expression.
3. System saves the new indicator.
Alternative Flows:
AF1: Invalid Expression
• User is notified and asked to correct it.
Security Considerations:
• Input validation to avoid injection attacks.
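The injection concern above can be addressed by parsing the user-supplied formula instead of evaluating it directly. The sketch below shows one possible check based on Python's ast module; the whitelisted field names are illustrative.

import ast

ALLOWED_FIELDS = {"critical_findings", "total_findings", "resolved_findings"}
ALLOWED_NODES = (
    ast.Expression, ast.BinOp, ast.UnaryOp, ast.Constant, ast.Name, ast.Load,
    ast.Add, ast.Sub, ast.Mult, ast.Div, ast.USub,
)

def validate_indicator_expression(expression: str) -> bool:
    """Accept only arithmetic over whitelisted database fields."""
    try:
        tree = ast.parse(expression, mode="eval")
    except SyntaxError:
        return False  # AF1: invalid expression, the user is asked to correct it
    for node in ast.walk(tree):
        if not isinstance(node, ALLOWED_NODES):
            return False
        if isinstance(node, ast.Name) and node.id not in ALLOWED_FIELDS:
            return False
    return True

# Example: a "criticality ratio" indicator is accepted, arbitrary code is rejected.
print(validate_indicator_expression("critical_findings / total_findings"))  # True
print(validate_indicator_expression("__import__('os').system('whoami')"))   # False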


Textual Description of "Generate Statistics"

Table 6.5: Use Case: Generate Statistics

Use Case ID: UC3.3
Use Case Name: Generate Statistics
Actor(s): Decision Maker
Description: This use case enables the user to compute statistics for selected indicators over specific periods or by filtering through sectors or organizations.
Preconditions:
• At least one indicator must exist.
Postconditions:
• Statistics are calculated and ready for visualization.
Trigger: User selects an indicator and period and clicks "Calculate".
Main Flow:
1. User selects filters (indicator, period, sector).
2. System processes data.
3. Statistics are displayed.
Alternative Flows: None
Security Considerations:
• Aggregation ensures individual data privacy.
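As an illustration of the calculation step, the sketch below aggregates an indicator per year and sector with pandas, returning only aggregated values in line with the privacy consideration above. The column names are assumptions for illustration.

import pandas as pd

def compute_indicator_statistics(records: pd.DataFrame, indicator: str) -> pd.DataFrame:
    """Average an indicator column per year and sector; no individual rows are returned."""
    records = records.copy()
    records["year"] = pd.to_datetime(records["audit_date"], dayfirst=True).dt.year
    return (
        records.groupby(["year", "sector"])[indicator]
        .agg(["mean", "count"])
        .reset_index()
    )

# Example usage with a small in-memory dataset.
df = pd.DataFrame({
    "audit_date": ["12/03/2023", "05/07/2023", "20/01/2024"],
    "sector": ["Banking", "Banking", "Health"],
    "critical_findings": [4, 2, 6],
})
print(compute_indicator_statistics(df, "critical_findings"))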


Textual Description of "Generate Report"

Table 6.6: Use Case: Generate Report

Use Case ID: UC3.5
Use Case Name: Generate Report
Actor(s): Decision Maker, Administrator
Description: This use case allows the user to generate detailed audit reports based on selected indicators or dashboards, which can be exported in PDF or CSV formats.
Preconditions:
• At least one dashboard or indicator must exist.
Postconditions:
• A report is generated and exported.
Trigger: The user requests report generation.
Main Flow:
1. User selects content to include.
2. System generates the report.
3. Report is displayed and can be exported.
Alternative Flows:
AF1: Export Error
• User is notified of the error.
Security Considerations:
• PDF reports are digitally signed.
• CSV exports are restricted to authorized roles.
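The CSV branch of this use case could be implemented as a simple download endpoint, as sketched below with Flask's Response object. The route, field names, and placeholder data are illustrative; role restriction and PDF signing are assumed to be handled elsewhere.

import csv
import io

from flask import Flask, Response

app = Flask(__name__)

def export_statistics_csv(rows: list[dict]) -> str:
    """Serialize the statistics rows to CSV text."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=["year", "sector", "mean", "count"])
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

@app.route("/reports/export.csv")
def download_report():
    rows = [{"year": 2024, "sector": "Health", "mean": 6.0, "count": 1}]  # placeholder data
    return Response(
        export_statistics_csv(rows),
        mimetype="text/csv",
        headers={"Content-Disposition": "attachment; filename=audit_report.csv"},
    )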

6.4 Sequence Diagram

6.5 Conception

6.5.1 Class Diagram



Figure 6.2: Sequence Diagram - Sprint 4

6.5.2 Interaction Diagram



Figure 6.3: Class Diagram - Sprint 4

6.6 Execution

Screenshots below illustrate the functionalities developed in this sprint.

6.6.1 Statistics Calculation



Figure 6.4: Interaction Diagram - Sprint 4

Figure 6.5: Login Interface


Figure 6.6: Admin Dashboard

6.6.2 Statistics Calculation

6.7 Conclusion

Sprint 4 focused on processing and analyzing audit data, providing tools for data visualization, indicator creation, statistics generation, and report export.



CHAPTER 7

GENERAL CONCLUSION

This work was carried out as part of our final year project at the National Cybersecurity Agency (ANCS), in fulfillment of the requirements for the Bachelor's degree in Computer Systems and Network Engineering at the Higher Institute of Computer Science of Mahdia (ISIMA). The objective of the project was to design and develop a web-based solution (a web platform) for the automated analysis of information systems audit reports. In an increasingly digital world where cybersecurity challenges are growing rapidly, this project responds to a real need: simplifying and improving the way audit reports are processed, analyzed, and used to support decision-making. Our solution was conceived as a modular and scalable application, developed progressively through four sprints following the Scrum methodology.

In Sprint 1, we focused on the compliance verification module. This module allows any user, even without authentication, to check whether an audit report meets predefined structural standards. We implemented an automated system that parses uploaded documents, detects key expected sections, and flags any missing or malformed titles. This feature helps streamline the early-stage review process and ensures standardization before further analysis.

In Sprint 2, we addressed user access management and role-based rights. We implemented a secure login system and defined four key user roles: Administrator, Study Officer, Audit Supervisor, and Decision-Maker. Each role was assigned specific permissions and access levels within the platform. This sprint required careful backend planning and interface design to ensure both security and usability, while respecting the principle of least privilege.

Sprint 3 was dedicated to data extraction. In this phase, we developed an interface for Study
Officers to open validated reports and extract relevant data fields, such as report dates, entities, and
findings. These fields are then displayed for review and validation before being securely stored in the
system’s database. This sprint emphasized precision, error handling, and user-friendliness, enabling
smooth and accurate information capture.

Finally, in Sprint 4, we developed the data processing and dashboard module. This component provides Decision-Makers with a statistical view of the data extracted from audit reports. Through dynamic graphs, tables, and filters, users can generate real-time reports and gain insights into trends and performance indicators. This sprint demonstrated the added value of data visualization for supporting strategic decisions.

Throughout this journey, we acquired valuable experience, both technical and organizational. We learned to manage our tasks incrementally, to adapt to feedback, and to implement practical solutions using a range of technologies. From a technical perspective, the project was built using modern technologies including Flask (Python) for backend development, React.js for the frontend, and PostgreSQL for database management. We ensured data security through encryption and access control mechanisms, in alignment with ANCS cybersecurity standards. Throughout the project, we followed best practices in software engineering, from modular development to documentation and testing.

This project has allowed us to apply the core skills gained throughout our academic path, from system design to frontend/backend development, database management, and security principles. More importantly, it offered us a chance to work in a professional context and contribute to a critical field such as cybersecurity.

Of course, the project is not without its limitations. We recognize the potential to enhance it
further, for example, by integrating artificial intelligence for deeper document analysis, supporting
more diverse file formats, or improving the user experience.

In conclusion, this project represents a concrete outcome of our training as computer systems and
network engineers. It reflects our ability to design, implement, and deliver solutions that are both
functional and relevant to current industry needs. We are proud of what we have accomplished, and
we leave this experience with greater confidence, a stronger sense of responsibility, and the motivation
to pursue further challenges in the world of IT and cybersecurity.

