Sadt Notes
Information System
An information system (IS) refers to a combination of people, processes, data, and
technology that work together to collect, store, process, analyze, and disseminate
information for supporting decision-making, coordination, control, and other
organizational activities. It encompasses the use of technology and resources to
manage and utilize information effectively within an organization.
Categories of Information Systems:-
1. Transaction Processing Systems (TPS): TPS capture and process transactional
data generated by daily business operations, such as sales, inventory, and payroll.
They focus on operational efficiency and data accuracy.
Information Strategies:-
1. Information Capture Strategy: This strategy focuses on collecting accurate
and relevant data from various sources, ensuring data quality and integrity
through proper validation and verification processes.
SDLC (Software Development Life Cycle)
SDLC stands for Software Development Life Cycle. It is a structured approach or
process used by software development teams to plan, design, develop, test, deploy,
and maintain software systems. SDLC provides a framework for managing the
entire software development process, from the initial concept to the final product.
Stages:-
1. Requirements Gathering: The development team gathers and documents the
requirements for the software. They interact with stakeholders, including clients
and end-users, to understand their needs, expectations, and desired
functionalities.
2. Design: Once the requirements are gathered, the team proceeds to design the
software. This stage involves creating a blueprint or plan that outlines how the
software will be structured, including its architecture, user interface design, and
database design.
5. Deployment: Once the software passes all the tests and is deemed ready for
release, it is deployed or installed in the production environment. This stage
involves setting up the necessary infrastructure, configuring the software, and
making it available for end-users to access and utilize.
System Analyst
A system analyst plays an important role in the development and implementation of
information systems within an organization. They bridge the gap between business
needs and technology by understanding the requirements of stakeholders and
translating them into functional solutions.
Role and Responsibilities of a System Analyst:-
1. Requirements Gathering: Systems analysts interact with stakeholders, such as
users, managers, and executives, to understand their needs and objectives. They
collect and analyze information about existing processes, identify pain points,
and determine the requirements for new or improved systems.
5. User Support and Training: After the system is implemented, systems analysts
provide support to end-users by assisting with troubleshooting, addressing user
queries, and providing training to ensure smooth adoption and efficient
utilization of the system.
Requirement gathering techniques
Requirement gathering techniques are methods used by software development
teams to gather information about the desired functionality, features, and constraints
of a software system from stakeholders and end-users or customers.
There are three common requirement gathering techniques:
Interviews: Interviews involve direct interaction between the project team and
stakeholders or end-users. In this technique, team members ask specific questions
to gather detailed information about the requirements. Interviews can be
conducted individually or in a group setting. It allows for in-depth discussions,
clarification of requirements, and the opportunity to explore additional details
based on the responses provided.
1. Introduction:-
Provide an overview of the system and its purpose.
Describe the key objectives and goals of the system.
2. Functional Requirements:-
Define the specific features and functionalities that the system should possess.
Clearly state the expected behaviour and actions of the system.
Specify any input data required and the corresponding output results.
3. Non-functional Requirements:-
Outline the qualities and constraints that the system should meet.
Include performance requirements (response time, scalability), usability,
reliability, security, and compatibility with other systems or platforms.
4. User Interface:-
5. System Interfaces:-
Identify any external systems, databases, or APIs that the system needs to interact
with.
Define the methods, protocols, and formats for data exchange.
6. Data Requirements:-
Define the data entities and their attributes that the system will handle.
Specify any data validation, storage, or retrieval requirements.
7. Operational Requirements:-
8. Documentation Requirements:-
Documentation standards and guidelines.
User manuals, technical guides, or other documentation to be produced.
9. Project Constraints:-
Timeframe and deadlines for system development and implementation.
Budget constraints or cost considerations.
Resource limitations (personnel, infrastructure, etc.)
Feasibility Study
A feasibility study is an assessment conducted to determine whether a proposed
project, venture, or idea is viable, practical, and worth pursuing. It helps decision-
makers make informed choices by considering the advantages, disadvantages, risks,
and costs associated with a particular project.
1. Technical Feasibility:-
Technical feasibility assesses whether the proposed project can be developed or
implemented using the available technology, infrastructure, and resources. It
examines factors such as software and hardware requirements, technical expertise,
and compatibility with existing systems. This study helps determine if the project is
technically achievable within the given constraints.
2. Economic Feasibility:-
Economic feasibility evaluates the financial viability of the project. It analyzes the
costs and benefits associated with the project, including investment requirements,
operating expenses, potential revenue or savings, and return on investment (ROI).
The study aims to determine if the project is financially viable and if it can generate
sufficient profits or cost savings to justify the investment.
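As a quick worked illustration of the ROI idea (all figures below are invented for this sketch, not taken from these notes):
```python
# Hypothetical figures for a cost/benefit check (not from the notes).
investment = 100_000            # one-time development cost
annual_benefit = 40_000         # expected yearly savings/revenue
annual_operating_cost = 10_000  # yearly running cost
years = 5

net_benefit = (annual_benefit - annual_operating_cost) * years - investment
roi = net_benefit / investment * 100
print(f"Net benefit: {net_benefit}, ROI: {roi:.0f}%")   # Net benefit: 50000, ROI: 50%
```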
3. Operational Feasibility:-
Operational feasibility focuses on assessing whether the proposed project aligns
with the organization's existing operations, processes, and resources. It examines
factors such as staffing requirements, skill sets, training needs, and potential impact
on day-to-day operations. This study helps determine if the project can be smoothly
integrated into the organization's existing structure and if it will be accepted and
supported by the people involved.
4. Schedule Feasibility:-
It evaluates the project timeline and determines whether the proposed project can
be completed within the desired time frame. It considers factors such as resource
availability, project dependencies, critical path analysis, and potential risks that
could affect the project schedule. The study helps in identifying any scheduling
constraints or challenges that may impact the project's successful implementation.
Change Management: During the development process, changes may arise.
Cost/benefit analysis helps assess the impact of proposed changes to the project's
scope, schedule, and budget.
Coupling:-
It refers to the level of dependency between different software components or
modules. It measures how much one component relies on another.
Low coupling means that the components are relatively independent of each other,
while high coupling means that they are tightly interconnected and heavily rely on
one another.
Low coupling is desirable because it reduces the ripple effects of changes made to
one component, making the system more flexible and easier to modify without
affecting other parts.
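A minimal sketch of the difference in Python (the OrderService and SmtpMailer names are invented for illustration):
```python
class SmtpMailer:
    def send(self, to: str, body: str) -> None:
        print(f"SMTP -> {to}: {body}")

# High coupling: the service builds its own SmtpMailer, so swapping
# the mailer means editing this class.
class TightlyCoupledOrderService:
    def __init__(self) -> None:
        self.mailer = SmtpMailer()          # hard-wired dependency

    def place_order(self, customer: str) -> None:
        self.mailer.send(customer, "Order placed")

# Low coupling: the mailer is passed in, so any object with a
# compatible send() method works and changes do not ripple here.
class LooselyCoupledOrderService:
    def __init__(self, mailer) -> None:
        self.mailer = mailer                # injected dependency

    def place_order(self, customer: str) -> None:
        self.mailer.send(customer, "Order placed")

LooselyCoupledOrderService(SmtpMailer()).place_order("alice@example.com")
```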
Entity Relationship Model
The ER model stands for Entity-Relationship model. It is a high-level data model
used to define the data elements and relationships for a specified system. It
develops a conceptual design for the database and provides a simple,
easy-to-understand view of the data. It uses entities, attributes, and relationships
to describe the structure and behaviour of a database system.
Weak Entity: An entity that depends on another entity is called a weak entity. A
weak entity does not have any key attribute of its own. The weak entity is
represented by a double rectangle.
Strong Entity: An entity that exists independently and has its own unique key
attribute. It can exist on its own without being dependent on any other entity.
Attribute:-
An attribute is a property that describes an entity. Example: id, age, contact
number, name, etc. can be attributes of a student.
Key Attribute:-
An attribute that uniquely identifies each entity in an entity set is known as a key
attribute. It is represented by an ellipse with the attribute name underlined.
Composite Attribute:-
An attribute that is composed of many other attributes is known as a composite
attribute. The composite attribute is represented by an ellipse, and its component
attributes are shown as ellipses connected to it.
Multivalued Attribute:-
An attribute that can have more than one value is known as a multivalued
attribute. A double oval is used to represent a multivalued attribute.
Example: a student can have more than one phone number.
Derived Attribute:-
An attribute that can be derived from another attribute is known as a derived
attribute. It is represented by a dashed ellipse.
Example: a person's age changes over time and can be derived from another
attribute like date of birth.
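A small sketch of a derived attribute in code (the Student class and its fields are invented for illustration): age is not stored, it is computed from the stored date of birth.
```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Student:
    student_id: int          # key attribute
    name: str
    date_of_birth: date      # stored attribute

    @property
    def age(self) -> int:    # derived attribute, never stored
        today = date.today()
        had_birthday = (today.month, today.day) >= (
            self.date_of_birth.month, self.date_of_birth.day)
        return today.year - self.date_of_birth.year - (not had_birthday)

print(Student(1, "Asha", date(2004, 8, 15)).age)
```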
2. One-to-Many: An entity from one set can be associated with multiple entities in
the other set, but an entity from the other set can be associated with only one
entity in the first set.
Example: a scientist can make many inventions, but each invention is made by
only one specific scientist.
4. Many-to-One: More than one instance of the entity on the left can be associated
with only one instance of the entity on the right.
Example: a student enrolls in only one course, but a course can have many
students.
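A minimal sketch of one-to-many cardinality in code (the Scientist/Invention classes are illustrative, echoing the example above): one scientist holds many inventions, while each invention refers back to exactly one scientist.
```python
from dataclasses import dataclass, field

@dataclass
class Invention:
    title: str
    scientist: "Scientist"        # many-to-one side: exactly one owner

@dataclass
class Scientist:
    name: str
    inventions: list = field(default_factory=list)  # one-to-many side

tesla = Scientist("Tesla")
coil = Invention("Tesla coil", tesla)
tesla.inventions.append(coil)
print(coil.scientist.name, len(tesla.inventions))   # Tesla 1
```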
Participation describes whether a record in one table must participate in a
relationship with a related record in another table, either mandatorily or
optionally.
OOM (Object Oriented Modeling)
Object-oriented modeling is a way of designing and representing systems or
problems using objects and their interactions. It is a popular approach in software
development and other fields where complex systems need to be analyzed and
designed.
We break down a system into individual objects, which are representations of real-
world or abstract entities. Each object has its own set of properties (attributes) and
behaviors (methods). Objects interact with each other by sending messages, which
trigger specific behaviors or actions.
Some important types used in OOM:
1. Class: A class is a blueprint or template that defines the attributes and methods
that objects of the class will possess. It represents a category of objects with
similar properties and behaviors.
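A minimal sketch of a class as a blueprint (the Account class is invented for illustration): two objects are created from the same template but hold their own state, and calling a method is how one "sends a message" to an object.
```python
class Account:
    def __init__(self, owner: str, balance: float = 0.0):
        self.owner = owner          # attribute
        self.balance = balance      # attribute

    def deposit(self, amount: float) -> None:   # behaviour (method)
        self.balance += amount

# Two objects built from the same blueprint:
a = Account("Asha")
b = Account("Ravi", 100.0)
a.deposit(50.0)                     # sending a message to object a
print(a.balance, b.balance)         # 50.0 100.0
```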
Dynamic modelling:-
It focuses on the behavior and functionality of the system. It captures the system's
dynamic aspects, such as the sequence of events, interactions, and state transitions
that occur over time. Dynamic models depict the system's behavior in response to
various events or actions. It helps in understanding how the system functions, the
flow of information, and the processes involved. (It focuses on the system's
behavior and how it responds to events or actions.)
Example In a banking system, dynamic modeling would illustrate how a
customer interacts with the system to perform actions like depositing money,
withdrawing funds, or transferring funds between accounts.
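A minimal sketch of dynamic behaviour as a state-transition table (the ATM-style states and events are invented for illustration, not taken from the notes):
```python
# Each (state, event) pair maps to the next state of the session.
TRANSITIONS = {
    ("idle", "insert_card"): "authenticating",
    ("authenticating", "pin_ok"): "ready",
    ("authenticating", "pin_bad"): "idle",
    ("ready", "withdraw"): "dispensing",
    ("dispensing", "cash_taken"): "idle",
}

def step(state: str, event: str) -> str:
    """Return the next state for an event; stay put if the event is invalid."""
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["insert_card", "pin_ok", "withdraw", "cash_taken"]:
    state = step(state, event)
    print(event, "->", state)
```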
There are various UML Diagrams:-
1. Use case Diagram
2. Class Diagram
3. Sequence Diagram
4. Activity Diagram
5. State Diagram
6. Object Diagram
7. Package Diagram
8. Deployment Diagram
9. Interaction Diagram
Software Testing
Software testing is the process of evaluating a software application or system to
ensure that it meets its intended requirements and functions correctly. By
identifying and fixing defects early on, it reduces the likelihood of issues arising in
production, thereby improving the quality, reliability, and usability of the software.
The objective is to identify and fix any bugs or problems before the software is
deployed or made available to users.
There are two types of software testing:-
1. Manual Testing
2. Automation Testing
Black Box Testing:-
It is a software testing technique where the tester examines the software application
without having knowledge of its internal structure, implementation details, or code.
It focuses on testing the functionality of the software based on the specified
requirements, without considering how the software achieves its results. It is also
known as behavioural testing and functional testing.
Functional Testing:-
It focuses on verifying that the software application or system functions correctly
and performs its intended tasks according to the specified requirements. It ensures
that the software meets the functional aspects or features expected by the users.
Functional Testing Techniques:-
1. Unit Testing:-
It is a type of functional testing that focuses on testing individual units or
components of software in isolation. The goal of unit testing is to verify the
correctness of each unit and ensure that it functions as intended.
Purpose: It aims to test the smallest testable parts of the software, such as
functions, methods, or classes. It helps identify defects or errors in the code at
an early stage.
Isolation: Unit tests are designed to be independent and isolated from other units
and external dependencies.
Automation: Unit testing is usually automated, which means that test cases are
written as code and executed automatically.
2. Integration Testing:-
It is a software testing technique that focuses on testing the integration and
interaction between different components or modules of a software system. It aims
to identify any defects or issues that may arise when these components are combined
and working together as a whole.
Verifying Interactions: It ensures that the components or modules of the
software interact correctly with each other.
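A minimal sketch of both levels using Python's built-in unittest module (the normalize and greet functions are invented for illustration): the first test checks one unit in isolation, the second checks the two units working together.
```python
import unittest

# Units under test (illustrative, not from the notes):
def normalize(name: str) -> str:
    return name.strip().title()

def greet(name: str) -> str:
    return f"Hello, {normalize(name)}!"   # greet depends on normalize

class UnitLevel(unittest.TestCase):
    def test_normalize_alone(self):
        # Unit test: one function, in isolation.
        self.assertEqual(normalize("  ada lovelace "), "Ada Lovelace")

class IntegrationLevel(unittest.TestCase):
    def test_greet_uses_normalize(self):
        # Integration test: the two units combined.
        self.assertEqual(greet("  ada lovelace "), "Hello, Ada Lovelace!")

if __name__ == "__main__":
    unittest.main()
```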
Alpha Testing: It is conducted within the development team or a dedicated
group of testers. It is performed before the software is released to external users
or customers. The main goal is to identify and resolve any major issues, validate
the software against the intended requirements, and gather feedback for
improvements.
Non-Functional Testing:-
2. Security Testing: Security testing checks the software for vulnerabilities and
potential risks to ensure that sensitive data is protected and unauthorized access
is prevented.
3. Load Testing: Load testing checks the system's load capacity, i.e., how many
people can work on the system simultaneously. During load testing, various
metrics are measured, such as response time, throughput, and resource
utilization.
4. Stress Testing: Stress testing is like pushing the software to its limits to see how
it handles intense or unusual conditions. It involves simulating heavy workloads,
large data volumes, or excessive user activity to measure how the system
performs and whether it can handle the stress without crashing or exhibiting
undesirable behaviour.
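As a toy illustration of the metrics mentioned under load testing (simulate_request is a stand-in for a real call; actual load tests use dedicated tools and many concurrent users):
```python
import time

def simulate_request() -> None:
    time.sleep(0.01)   # pretend the system takes ~10 ms to respond

N = 50
latencies = []
start = time.perf_counter()
for _ in range(N):
    t0 = time.perf_counter()
    simulate_request()
    latencies.append(time.perf_counter() - t0)
elapsed = time.perf_counter() - start

print(f"avg response time: {sum(latencies) / N * 1000:.1f} ms")
print(f"throughput: {N / elapsed:.1f} requests/second")
```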
Maintenance Testing:-
Maintenance testing is a type of software testing that is performed after a software
application or system has been deployed or released into production. It focuses on
verifying and validating the software's behaviour, functionality, and performance
after it has undergone changes, updates, or maintenance activities.
When software undergoes maintenance, such as bug fixes, enhancements, or
updates, there is a need to ensure that the changes made do not introduce new issues
or regressions. Maintenance testing helps in validating the correctness and stability
of the software after these modifications.
3. Immediate benefits: Users can start leveraging the advantages of the new
technology immediately after the conversion.
2. Incremental learning: Users can gradually adapt to the new system, reducing
the impact of a sudden change.
1. Reduced downtime: The existing system remains operational during the
parallel run, minimizing business disruptions.
3. Smooth transition: Users can gradually shift to the new system, ensuring a
smoother transition and minimizing resistance to change.
2. Early feedback: Users involved in the pilot phase can provide feedback,
allowing for improvements and adjustments before wider adoption.
3. Risk mitigation: Any issues or challenges encountered during the pilot phase
can be addressed before rolling out the conversion to the entire organization.
Verification vs. Validation
Verification: It consists of checking of documents/files and is performed by humans.
Validation: It consists of execution of the program and is performed by the computer.
Prototype Model
A prototype model is a software development approach that involves creating an
early, simplified version of a product to gather feedback, test concepts, and validate
requirements. Prototyping Model is used when the customers do not know the exact
project requirements beforehand. In this model, a prototype of the end product is
first developed, tested and refined as per customer feedback repeatedly till a final
acceptable prototype is achieved which forms the basis for developing the final
product.
Advantages:-
1. This model is flexible in design.
2. It is easy to detect errors.
3. We can find missing functionality easily.
4. It can be reused by the developer for more complicated projects in the future.
5. It ensures a greater level of customer satisfaction and comfort.
6. It is ideal for online systems.
7. It helps developers and users both understand the system better.
8. It can actively involve users in the development phase.
Disadvantages:-
1. This model is costly.
2. It has poor documentation because of continuously changing customer
requirements.
3. There may be too much variation in requirements.
4. Customers sometimes demand the actual product to be delivered soon after seeing
an early prototype.
5. There may be sub-optimal solutions because developers are in a hurry to build
prototypes.
6. Customers may not be satisfied or interested in the product after seeing the initial
prototype.
7. There is uncertainty in determining the number of iterations.
8. It may increase the complexity of the system.
Time and Cost Efficiency: JAD (Joint Application Development) reduces the time
required for the requirements gathering and design phases, as it promotes focused
and intense collaboration. This efficiency can lead to cost savings by preventing
rework.
Phases of RAD Model:-
Advantages:-
Disadvantages:-
Waterfall Model
The Waterfall model is a linear and sequential software development methodology
that follows a predefined set of steps in a specific order. It is called the "Waterfall"
model because the process flows downwards, like a waterfall, with each phase
cascading into the next.
2. System Design: The overall system architecture and design are created based on
the requirements gathered in the previous phase. The system design specifies how
different components and modules will work together, and outlines the data
structures, interfaces, and algorithms needed.
5. Deployment: Once the software has been thoroughly tested and approved, it is
deployed or released to the end-users or clients. This phase involves installation,
configuration, and setting up the software in the production environment.
Advantages:-
1. This model is simple to implement, and the number of resources required for it
is minimal.
2. The requirements are simple and explicitly declared; they remain unchanged
during the entire project development.
3. The start and end points for each phase are fixed, which makes it easy to track
progress.
4. The release date for the complete product, as well as its final cost, can be
determined before development.
5. It provides ease of control and clarity for the customer due to a strict reporting
system.
Disadvantages:-
1. In this model, the risk factor is higher, so it is not suitable for large and complex
projects.
2. This model cannot accommodate changes in requirements during development.
3. It becomes tough to go back to a previous phase. For example, if the application
has moved to the coding phase and there is a change in requirements, it becomes
tough to go back and change it.
4. Since testing is done at a later stage, it does not allow identifying challenges and
risks in earlier phases, making a risk reduction strategy difficult to prepare.
Distributed System
It is a collection of independent computers or nodes connected through a network.
These nodes work together to perform a common task or provide a unified service.
All the nodes in this system communicate with each other and handle processes in
tandem. Each of these nodes contains a small part of the distributed operating
system software.
Example:-
A web-based e-commerce platform like Amazon is an example of a distributed system. The
platform consists of multiple servers distributed across different locations. When a
user visits the website, their request is processed by a distributed system, where
different servers work together to handle tasks. This distributed architecture allows
Amazon to handle a large number of users, provide scalability, and ensure fault
tolerance.
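As a loose single-machine analogy of this idea (worker processes standing in for networked nodes; the task is invented for the sketch), a job can be split across independent workers whose partial results are combined:
```python
# Processes stand in for independent nodes that each handle a share
# of one large job; the coordinator combines the partial results.
from concurrent.futures import ProcessPoolExecutor

def handle_chunk(chunk: range) -> int:
    return sum(chunk)          # each "node" processes its part

if __name__ == "__main__":
    chunks = [range(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        partials = list(pool.map(handle_chunk, chunks))
    print(sum(partials))       # partial results merged into one answer
```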
Advantages:-
1. Scalability: Distributed systems can scale horizontally by adding more nodes to
the network, allowing them to handle increased workloads and accommodate a
growing user base. This makes them suitable for applications that require high
scalability, such as social media platforms or cloud-based services.
2. Performance: Multiple nodes can work in parallel and combine their processing
capabilities, reducing the time required to perform complex computations or
process large amounts of data.
Centralized System
A centralized system refers to a computing or organizational structure where control
and decision-making authority are concentrated in a single central location or entity.
In this system, all data, resources, and decision-making processes are managed and
controlled by a central authority.
Example:-
A traditional banking system is an example of a centralized system. In this system,
a central bank oversees and controls all banking operations, including account
management, transactions, and customer services. All customer data, financial
records, and decision-making processes are handled by the central bank, and
individual branch offices operate under centralized control and supervision.
Advantages:-
1. Efficient Management: In a centralized system, a single authority can efficiently
manage and coordinate all operations. This centralized control enables
streamlined decision-making, resource allocation, and enforcement of policies. It
simplifies management processes and reduces duplication of efforts.
Disadvantages:-
2. Lack of Local Autonomy: Centralized systems can limit the autonomy of
individual units or branches. Decision-making power and flexibility may be
restricted, as all important decisions are made by the central authority. This can
lead to delays in responding to local needs or changes in the market.
DFD (Data Flow Diagram)
A Data Flow Diagram (DFD) is a graphical representation of how data flows
through a system or a process. It helps to visualize how information moves from
one part of the system to another. It consists of various components, such as
processes, data stores, and data flows.
Processes: These are the actions or operations that are performed on the data.
Processes are represented by circles or ovals. Each process takes inputs, does
some processing, and produces outputs.
Data Stores: These are places where data is stored or retrieved from. Data stores
are represented by rectangles. They can be physical, like a database or a file
cabinet, or virtual, like a temporary memory location.
Data Flows: These are arrows that represent the movement of data between
processes and data stores. Data flows show how information is transmitted from
one point to another within the system.
Levels of DFD
Data Flow Diagrams (DFDs) are used to represent how data flows through a system
or process. DFDs consist of different levels that help in progressively detailing the
data flow and system components.
1. Level 0 DFD (Context Diagram):-
The Level 0 DFD, also known as the Context Diagram, provides an overview of
the entire system.
It represents the highest level of abstraction and shows the system as a single
process or function, surrounded by its external entities (sources or sinks of data).
The main purpose of the Context Diagram is to show how the system interacts
with its environment without diving into internal details.
2. Level 1 DFD:-
The Level 1 DFD represents the major processes or sub-systems identified in the
Context Diagram.
It breaks down the single process of the Context Diagram into more manageable
and understandable sub-processes.
Level 1 DFD provides a clear picture of how data flows between the main
processes, data stores, and external entities.
3. Level 2 DFD:-
The Level 2 DFD takes each sub-process from the Level 1 DFD and further
decomposes it into more detailed sub-processes.
It shows a more refined view of the system, breaking down the processes into
smaller, manageable parts.
Level 2 DFDs provide more insights into how data moves within each sub-
process and how data stores are used.
4. Level 3 DFD (and so on):-
Depending on the complexity of the system, you can continue to create lower-
level DFDs like Level 3, Level 4, and so on.
Each subsequent level breaks down processes from the higher-level DFD into
more detailed and granular sub-processes.
These lower levels provide a step-by-step understanding of data flow and process
interactions.
Application Architecture
Application architecture refers to the way software applications are structured and
organized. It defines the components, their interactions, and the overall layout of
the application. A good application architecture ensures that the software is reliable,
scalable, maintainable, and meets its intended requirements.
1. Server-Based Architecture:-
In a server-based architecture, most of the processing and data management tasks
are performed on the server-side. The clients (users' devices, like computers or
smartphones) primarily act as display terminals and interact with the server to send
requests and receive responses. This type of architecture is common in web
applications, where the server handles the core logic and data processing, and the
client devices show the results.
2. Client-Based Architecture:-
In a client-based architecture, most of the processing and data management tasks
are performed on the client-side. The server primarily serves as a data store or a
source of information. The client devices handle the core logic and data processing,
reducing the need for continuous communication with the server. This architecture
is commonly used in desktop applications and mobile apps.
3. N-Tier Architecture:-
N-tier architecture, also known as multi-tier architecture, divides the software
application into multiple layers or tiers, each responsible for different aspects of the
application.
The most common n-tier architecture consists of three tiers:-
Presentation Tier (or Client Tier): This is the user interface layer where the
client interacts with the application.
Application Tier (or Business Logic Tier): This layer contains the business
logic and processing rules of the application.
Data Tier (or Data Storage Tier): This layer deals with data storage and
retrieval, interacting with databases or other data sources.
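A minimal sketch of the three tiers in one file (all names and data are invented for illustration; in a real system each tier is typically a separate component or service):
```python
# Data tier: storage and retrieval (a dict stands in for a database).
_DB = {"42": {"name": "Asha", "balance": 100.0}}

def load_user(user_id: str) -> dict:
    return _DB[user_id]

# Application tier: business rules live here, not in the UI or the DB.
def can_withdraw(user_id: str, amount: float) -> bool:
    user = load_user(user_id)
    return 0 < amount <= user["balance"]

# Presentation tier: formats results for the user, no business logic.
def show_withdrawal(user_id: str, amount: float) -> None:
    ok = can_withdraw(user_id, amount)
    print(f"Withdrawal of {amount:.2f}: {'approved' if ok else 'denied'}")

show_withdrawal("42", 30.0)    # approved
show_withdrawal("42", 500.0)   # denied
```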