Software Engineering DSC-C-BCA-241T - Unit - II

The document outlines key design concepts in software engineering, including abstraction, modularity, encapsulation, and reusability, which serve as foundational principles for creating efficient and maintainable software systems. It also discusses various design models and architectural styles, emphasizing the importance of a well-structured design for scalability, performance, and user-centric design. Additionally, it highlights the significance of architectural patterns and the challenges associated with architectural design, advocating for careful planning and collaboration among stakeholders.


UNIT-II

Design Concepts

Design Models:

 Architectural Styles
 Architectural Design
 Assessing Alternative Architectural Designs
 Architectural Mapping Using Data Flow

User Interface Design:

 Golden Rules of Interface Design
 User Interface Analysis and Design
 Interface Analysis
 Interface Design Steps

Design Concepts

Design concepts in software engineering are fundamental principles and guidelines that help in
creating software systems that are efficient, maintainable, scalable, and robust. These concepts serve
as the foundation for software design and development processes. Below are some key design
concepts:

1. Abstraction (Hiding Irrelevant Detail): Hiding complex implementation details and exposing only
essential features. It simplifies design and enhances maintainability by reducing complexity. For
example: using interfaces or abstract classes in object-oriented programming to define common
behavior without revealing the implementation. There are two common abstraction mechanisms:

 Functional Abstraction: A module is specified by the function it performs; the details of the
algorithm used to accomplish the function are not visible to the function's users. Functional
abstraction forms the basis for function-oriented design approaches.
 Data Abstraction: Details of the data elements are not visible to the users of the data. Data
abstraction forms the basis for object-oriented design approaches.

2. Modularity: Dividing a software system into smaller, self-contained units or modules. It encourages
separation of concerns, easier testing, parallel development, and reuse of code. For example:
microservices architecture, where each service handles a specific business function independently.

The desirable properties of a modular system are:

 Each module is a well-defined system that can be used with other applications.
 Each module has single specified objectives.
 Modules can be separately compiled and saved in the library.
 Modules should be easier to use than to build.
 Modules are simpler from outside than inside.
3. Encapsulation: Restricting direct access to some of an object's components, typically by using
private fields and public methods. Protects the integrity of data and ensures controlled interaction
with objects. For Example: Getter and setter methods in classes.
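As a minimal Python sketch (the class and method names are illustrative), a property serves as the getter while a validating method controls every mutation, so the balance cannot be corrupted from outside:

```python
class BankAccount:
    def __init__(self, balance: float = 0.0):
        self._balance = balance        # leading underscore: private by convention

    @property
    def balance(self) -> float:        # getter: read-only view of the field
        return self._balance

    def deposit(self, amount: float) -> None:
        # setter-style mutation with validation, protecting data integrity
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount
```

Callers read `account.balance` but can change it only through `deposit`, which enforces the invariant.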

4. Separation of Concerns: Dividing the software into distinct sections, each handling a specific aspect
or functionality. It simplifies development, testing, and maintenance by focusing on independent
parts. For example: The Model-View-Controller (MVC) architectural pattern separates data handling,
user interface, and control logic.
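A compact, illustrative sketch of the MVC split (the to-do classes are invented for this example): the model owns the data, the view owns presentation, and the controller mediates between them:

```python
class TodoModel:
    """Model: owns the data and nothing else."""
    def __init__(self):
        self.items = []

    def add(self, text: str) -> None:
        self.items.append(text)

class TodoView:
    """View: owns presentation and nothing else."""
    def render(self, items) -> str:
        return "\n".join(f"- {t}" for t in items)

class TodoController:
    """Controller: mediates between model and view."""
    def __init__(self, model: TodoModel, view: TodoView):
        self.model, self.view = model, view

    def add_item(self, text: str) -> str:
        self.model.add(text)
        return self.view.render(self.model.items)
```

Because each concern lives in one class, the view can be replaced (say, with HTML output) without touching the model.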

5. Reusability: Designing components in a way that they can be reused in different applications or
scenarios. It reduces development time and improves consistency. For example: Creating shared utility
libraries or using design patterns like Singleton or Factory.
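For instance, a simple Factory keeps creation logic in one reusable place. The exporter classes below are illustrative inventions, not from any particular library:

```python
import json

class JSONExporter:
    def export(self, data: dict) -> str:
        return json.dumps(data)

class CSVExporter:
    def export(self, data: dict) -> str:
        return ",".join(str(v) for v in data.values())

def make_exporter(fmt: str):
    # Factory: every caller reuses this single piece of creation logic.
    exporters = {"json": JSONExporter, "csv": CSVExporter}
    return exporters[fmt]()
```

Adding a new format means registering one new class; no calling code changes.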

6. Scalability: Ensuring the system can handle increased load (more users, data, or transactions)
without degradation of performance. It prepares the system for growth and future demands. For
example: Using load balancers and database sharding in distributed systems.
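Hash-based shard routing is one common way to spread data across databases. This is an illustrative sketch only (the function name and the MD5 choice are assumptions, not a prescription):

```python
import hashlib

def shard_for(key: str, num_shards: int) -> int:
    """Deterministically route a record key to one of num_shards databases."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards
```

Because the hash is deterministic, reads and writes for the same key always land on the same shard, which lets capacity grow by adding shards.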

7. Coupling and Cohesion:

 Coupling: The degree of dependency between components; the aim is low coupling, to
minimize interdependence (for example, loose coupling between microservices). In software
engineering, coupling is the degree of interdependence between software modules. Two
modules that are tightly coupled are strongly dependent on each other, whereas two
modules that are loosely coupled are largely independent. Uncoupled modules have
no interdependence at all.

The types of module coupling, from the loosest (most desirable) to the tightest (least desirable), are:

1. No Direct Coupling: Two modules have no direct coupling when they operate independently
and do not communicate with each other at all.
2. Data Coupling: When data of one module is passed to another module, this is called data
coupling.
3. Stamp Coupling: Two modules are stamp coupled if they communicate using composite
data items such as structures or objects. When a module passes a non-global data
structure, or an entire structure, to another module, they are said to be stamp coupled. For
example, passing a structure variable in C or an object in C++ to a module.
4. Control Coupling: Control coupling exists between two modules if data from one module is
used to direct the order of instruction execution in another (e.g., a flag passed to select a behavior).
5. External Coupling: External coupling arises when two modules share an externally
imposed data format, communication protocol, or device interface. This relates to
communication with external tools and devices.
6. Common Coupling: Two modules are common coupled if they share information through
some global data items.
7. Content Coupling: Content Coupling exists among two modules if they share code, e.g., a
branch from one module into another module.
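The difference between the loosest and one of the tighter forms fits in a few lines. In this illustrative Python sketch, the first pair of functions is data coupled (simple parameters only), while the last function is common coupled through a shared global:

```python
# Data coupling (loose, preferred): modules exchange simple parameters only.
def compute_tax(amount: float, rate: float) -> float:
    return amount * rate

def total_price(amount: float, rate: float) -> float:
    return amount + compute_tax(amount, rate)

# Common coupling (tight, avoid): modules depend on shared global state.
TAX_RATE = 0.2

def compute_tax_global(amount: float) -> float:
    return amount * TAX_RATE   # hidden dependence: any module may change TAX_RATE
```

The data-coupled version can be tested and reused in isolation; the common-coupled one silently changes behavior whenever any other module rebinds `TAX_RATE`.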

 Cohesion: The degree to which the components of a module belong together; the aim is high
cohesion, to make modules more focused and maintainable. For example: a class whose
methods all relate to its primary purpose. Cohesion is an ordinal type of measurement
and is generally described as "high cohesion" or "low cohesion."

Types of module cohesion, from the strongest (most desirable) to the weakest (least desirable):
1. Functional Cohesion: Functional cohesion exists if the different elements
of a module cooperate to achieve a single function.
2. Sequential Cohesion: A module possesses sequential cohesion if its
elements form parts of a sequence, where the output
from one element of the sequence is the input to the next.
3. Communicational Cohesion: A module has communicational cohesion
if all tasks of the module refer to or update the same data structure, e.g., the set
of functions defined on an array or a stack.
4. Procedural Cohesion: A module has procedural cohesion if its
elements are all parts of a procedure in which a particular sequence
of steps must be carried out to achieve a goal, e.g., the algorithm for decoding
a message.
5. Temporal Cohesion: When a module groups functions that are related only by the
fact that they must all be executed within the same span of time (e.g., at startup),
the module exhibits temporal cohesion.
6. Logical Cohesion: A module is logically cohesive if all its elements
perform similar kinds of operations, e.g., error handling, data input,
or data output.
7. Coincidental Cohesion: A module has coincidental cohesion if it
performs a set of tasks that are related to each other very loosely, if at all.
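A short, illustrative contrast: `EmailValidator` below is functionally cohesive (every element serves one purpose), while `MiscUtils` shows the coincidental cohesion to avoid. Both class names and the regex are invented for this sketch:

```python
import re

class EmailValidator:
    """Functional cohesion: every element cooperates toward one purpose."""
    PATTERN = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$")

    def is_valid(self, address: str) -> bool:
        return bool(self.PATTERN.match(address))

class MiscUtils:
    """Coincidental cohesion (avoid): tasks related only loosely, if at all."""
    def is_valid_email(self, address): ...
    def parse_date(self, text): ...
    def ring_alarm(self): ...
```

The cohesive class has one reason to change; the grab-bag class accumulates unrelated reasons to change and resists reuse.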

Difference between Coupling and Cohesion:

Aspect        Cohesion                                        Coupling
Definition    The degree to which elements within a           The degree of interdependence between
              module work together to fulfil a single,        software modules. High coupling means that
              well-defined purpose.                           modules are closely connected and changes
                                                              in one module may affect other modules.
Scope         An intra-module concept.                        An inter-module concept.
Purpose       Represents the relationships within a           Represents the relationships between
              module.                                         modules.
Quality       Increasing cohesion is good for software.       Increasing coupling should be avoided.
Focus         Represents the functional strength of a         Represents the degree of interdependence
              module.                                         among modules.
Best result   Highly cohesive modules give the best           Loosely coupled modules give the best
              software.                                       software.
Example       A cohesive module focuses on a single           A coupled module is connected to other
              thing.                                          modules.
Creation      Exists within a single module.                  Exists between two different modules.

8. Design for Change: Building systems that can adapt to evolving requirements or technologies with
minimal effort. It reduces long-term maintenance costs. For example: Using dependency injection to
easily swap components or services.
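A minimal dependency-injection sketch (the mailer and service names are illustrative): because `SignupService` receives its collaborator instead of constructing it, a fake can be swapped in without touching the service:

```python
class SmtpMailer:
    """Production collaborator (sketch only; no real SMTP here)."""
    def send(self, to: str, body: str) -> str:
        return f"smtp sent to {to}"

class FakeMailer:
    """Test double that records messages instead of sending them."""
    def __init__(self):
        self.sent = []

    def send(self, to: str, body: str) -> str:
        self.sent.append((to, body))
        return "recorded"

class SignupService:
    def __init__(self, mailer):        # the dependency is injected, not hard-coded
        self.mailer = mailer

    def register(self, email: str) -> str:
        return self.mailer.send(email, "Welcome!")
```

Swapping `SmtpMailer` for `FakeMailer` requires no change to `SignupService`, which is exactly the adaptability this principle is after.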

9. DRY (Don't Repeat Yourself): Avoiding redundancy by ensuring that a piece of knowledge is
represented only once in the system. It prevents inconsistency and reduces the effort required for
updates. For example: Abstracting repetitive logic into utility methods or reusable classes.
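As a small illustrative sketch, the discount rule below lives in exactly one function, so a change to the rate or the rounding happens in one place:

```python
def apply_discount(price: float, rate: float = 0.1) -> float:
    """The discount rule lives here, and only here."""
    return round(price * (1 - rate), 2)

# Every caller reuses the single definition instead of re-implementing it.
invoice_total = apply_discount(200.0)
cart_total = apply_discount(50.0)
```

If the rule were copy-pasted at each call site, updating the rate would risk the inconsistency DRY is meant to prevent.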

10. Keep It Simple, Stupid: Designing systems to be as simple as possible, avoiding unnecessary
complexity. It enhances understanding, reduces bugs, and makes maintenance easier. For example:
Using straightforward algorithms instead of over-engineered solutions.

11. SOLID Principles: A set of five principles for object-oriented design:

 Single Responsibility Principle (SRP): A class should have only one reason to change.
 Open/Closed Principle (OCP): Software entities should be open for extension but closed for
modification.
 Liskov Substitution Principle (LSP): Subtypes should be substitutable for their base types.
 Interface Segregation Principle (ISP): Clients should not be forced to depend on interfaces
they do not use.
 Dependency Inversion Principle (DIP): High-level modules should not depend on low-level
modules; both should depend on abstractions.
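The Open/Closed Principle, for example, can be sketched with a small strategy hierarchy (the names are illustrative): `checkout` never changes, while new pricing behaviors arrive as new subclasses:

```python
from abc import ABC, abstractmethod

class Discount(ABC):
    """checkout() depends on this abstraction and stays closed for modification."""
    @abstractmethod
    def apply(self, price: float) -> float: ...

class NoDiscount(Discount):
    def apply(self, price: float) -> float:
        return price

class SeasonalDiscount(Discount):
    """Open for extension: new rules are added as new subclasses."""
    def apply(self, price: float) -> float:
        return price * 0.9

def checkout(price: float, discount: Discount) -> float:
    return discount.apply(price)
```

The same sketch also illustrates DIP: the high-level `checkout` depends on the `Discount` abstraction, not on any concrete rule.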

12. Patterns and Best Practices:

 Design Patterns: Proven solutions to common design problems (e.g., Singleton, Factory, and
Observer).
 Best Practices: Guidelines to improve design quality (e.g., version control, code reviews).

13. Usability and User-Centric Design: Focusing on the end user’s needs, ensuring the system is easy
to use and understand. It enhances user satisfaction and adoption. For example: Designing intuitive
user interfaces and seamless workflows.

14. Performance and Efficiency: Designing systems to optimize resource utilization (CPU, memory,
network). It ensures fast response times and lower operational costs. For example: Caching frequently
accessed data.
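As an illustrative sketch of caching frequently accessed data, Python's standard `functools.lru_cache` memoizes a lookup so repeated calls skip the expensive work (the config-lookup function is invented for this example):

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def fetch_config(key: str) -> str:
    # Stand-in for an expensive database or network lookup.
    return key.upper()

fetch_config("db_host")   # computed on the first call
fetch_config("db_host")   # served from the in-memory cache thereafter
```

`fetch_config.cache_info()` reports hits and misses, which is useful for verifying that the cache is actually reducing load.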

15. Security by Design: Incorporating security principles throughout the design process. It protects
data and systems from unauthorized access or vulnerabilities. For example: Using encryption,
authentication, and access control mechanisms.
Design Models: The design model in software engineering is a blueprint that represents the
architecture, components, interfaces, and data for a system. It serves as a bridge between the
requirements specification and the implementation phase in software development.

Key Aspects of the Design Model: The design model is typically divided into several components:

 Architectural Design: It defines the overall structure of the system and identifies major
components (modules, subsystems) and their relationships. Example: Layered architecture,
microservices, MVC (Model-View-Controller).
 Component-Level Design: It focuses on designing individual software components, specifies
how classes, objects, functions, and modules interact. Example: Class diagrams in UML.
 Data Design: It defines how data is structured, stored, and accessed, includes database design,
data models, and data flow diagrams. Example: ER diagrams, relational models.
 Interface Design: It specifies how the system interacts with users and other systems, includes
UI/UX design, API design, and communication protocols. Example: Wireframes, RESTful API
specifications.
 Behavioral Design: It describes how the system behaves in response to events, includes state
transitions, workflows, and system behavior models. Example: Sequence diagrams, state
machines.

Types of Design Models

 Structural Models – Focus on the organization of system components (e.g., Class Diagrams,
ER Diagrams).
 Behavioral Models – Represent how the system behaves over time (e.g., Use Case Diagrams,
Sequence Diagrams).
 Interaction Models – Show how different entities interact (e.g., Communication Diagrams).

Importance of the Design Model

 Ensures scalability and maintainability.


 Helps in better planning and resource allocation.
 Provides a clear understanding for developers and stakeholders.
 Supports reuse of components and design patterns.
 Helps in identifying potential issues before implementation.

Architectural Styles

Different architectural styles create a framework for all parts of the system. Software designed for
computer systems can have different designs or architectures. Each architectural style describes a
system category that encompasses:

 A set of components
 A set of connectors that enables “communication and coordination”
 Constraints that define how components can be integrated to form the system.
 Semantic models to understand the overall properties of a system.
The common architectural styles are:

 Data-Centered Architectures
 Dataflow Architectures
 Call and Return Architectures
 Object-Oriented Architectures
 Layered Architectures
Data Centered Architectures: A data store will be at the heart of this system and will be viewed often
by other parts that update, add, remove, or change the data in the store. The data-centered
architecture promotes integrability i.e. the existing components can be changed and a new client
component can be added to the architecture without the permission or concern of other clients.

Dataflow Architectures: This style is used to transform input data into output data through a
series of computational components. The classic example is pipe-and-filter architecture, which has a
set of components called filters connected by pipes. Pipes are used to transmit data from one
component to the next. Each filter works independently and is designed to accept data input of a
certain form and produce data output of a specified form for the next filter. The filters do not
require any knowledge of the workings of neighboring filters.
If the data flow degenerates into a single line of transforms, it is termed batch sequential. This
structure accepts a batch of data and then applies a series of sequential components to transform
it.
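A pipe-and-filter flow can be sketched in a few lines (the text-cleaning filters are illustrative): each filter accepts one form of data and produces the next form, with no knowledge of its neighbors:

```python
def strip_blank(lines):
    """Filter 1: drop empty lines."""
    return [ln for ln in lines if ln.strip()]

def normalize(lines):
    """Filter 2: trim whitespace and lowercase."""
    return [ln.strip().lower() for ln in lines]

def dedupe(lines):
    """Filter 3: remove duplicates, preserving order."""
    seen, out = set(), []
    for ln in lines:
        if ln not in seen:
            seen.add(ln)
            out.append(ln)
    return out

def pipeline(data, filters):
    # The "pipes": pass each filter's output to the next filter's input.
    for f in filters:
        data = f(data)
    return data
```

For example, `pipeline(["  Apple", "", "apple ", "Banana"], [strip_blank, normalize, dedupe])` yields `["apple", "banana"]`; filters can be reordered or replaced without touching the others.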

Call and Return Architectures: It is used to create a program that is easy to scale and modify. A number
of substyles exist within this category:

Main program/sub program architectures: This classic program structure decomposes function into
a control hierarchy where a main program invokes a number of program components, which in turn
may invoke still other components.

Remote procedure call architectures: The components of a main program/ subprogram architecture
are distributed across multiple computers on a network.

Object-Oriented Architectures: The components of a system encapsulate data and the operations
that must be applied to manipulate the data. Communication and coordination between components
is accomplished via message passing.

It enables the designer to decompose a problem into a collection of autonomous objects. Other objects
are not aware of an object's implementation details, allowing changes to be made without having
an impact on other objects.

Layered Architectures: A number of different layers are defined, with each layer performing a well-
defined set of operations. Moving inward, the operations of each layer become progressively closer
to the machine instruction set. At the outer layer, components service user interface operations;
at the inner layer, components perform operating system interfacing (communication
and coordination with the OS). Intermediate layers provide utility services and application software
functions.
Architectural Patterns

Architectural patterns in software engineering provide reusable solutions to common design problems
in software architecture. They help structure applications efficiently, ensuring scalability,
maintainability, and performance.

Organization and Refinement

The refinement of architectural styles is an ongoing process driven by scalability, performance, and
maintainability requirements. While organizational styles provide a base, their evolution ensures they
remain relevant to modern software challenges. The choice of architecture should align with business
goals, technical constraints, and future growth plans.

Architectural Design

Architectural design is a foundational aspect of software engineering, serving as the blueprint for
building robust, scalable, and maintainable software systems. By establishing a clear and well-thought-
out architecture, software architects ensure that the system meets both current and future
requirements effectively. A well-designed software architecture provides numerous benefits,
including improved maintainability, enhanced scalability, better performance, increased flexibility,
higher security, and cost efficiency.

It facilitates effective communication among stakeholders, supports scalability and performance


optimization, and ensures that the system is reliable and secure. Furthermore, architectural design
promotes reusability, modularity, and extensibility, making it easier to adapt to changing
requirements and integrate new features. However, architectural design also presents challenges such
as increased initial complexity, higher upfront costs, and the potential for over-engineering. Balancing
these challenges with the benefits requires careful planning, continuous improvement, and effective
collaboration among all stakeholders.

Representing the System in Context: An architectural context diagram (ACD) is a graphic representation
of the system and of the external components that interact with it. These components are linked to
the system via interfaces. There are four kinds of elements in an ACD:

1. Superordinate systems - systems that use the target system as part of some higher-level processing
2. Subordinate systems - systems that are used by the target system to realize its own functions
3. Peers - systems (components) at the same level as the target system, interacting on a peer-to-peer basis
4. Actors - external entities (people, devices) that interact with the target system by producing or consuming data

Defining Archetypes: An archetype is a class or pattern that represents a core abstraction that is
critical to the design of an architecture for the target system. Archetypes can be derived by examining
the analysis classes defined as part of the requirements model. For example, a home security system
might define the following archetypes:

1. Node: It represents the cohesive collection of input and output elements. For example:
Various sensors and a variety of alarm indicators.
2. Detector: It is an abstraction that encompasses all sensing equipment that feeds information
into the target system.
3. Indicator: It is an abstraction that represents all mechanisms (like alarm sirens, flashing lights,
bells etc.) for indicating that an alarm condition is occurring.
4. Controller: It is an abstraction that depicts the mechanism that allows the arming or disarming
of a node. If controllers reside on a network, they have the ability to communicate with one
another.
Refining the Architecture into Components: The software architecture is refined into components.
The components are derived from the analysis classes within the application domain; many
infrastructure components (e.g., a memory management component) are also derived from outside
the application domain. For example, based on functionality, the following components are derived
for the SafeHome home security function:

1. External communication management: Coordinates communication of the security


function with external entities such as other internet-based systems and external alarm
notification.
2. Control panel process: Manages all control panel functionality.
3. Detector management: Coordinates access to all detectors attached to the system.
4. Alarm processing: Verifies and acts on all alarm conditions.

Describing Instantiations of the System: The architectural design at the high level contains the following
details: the archetypes, the overall structure of the system, and the major system components. To refine
the architecture further, the components are elaborated to show additional detail.

• Design classes are defined later, during component-level design.

As the software architecture is refined into components, the structure of the system begins to
emerge. The architecture must accommodate many infrastructure components that enable
application components but have no business connection to the application domain. For example,
memory management components, communication components, database components, and
task management components are often integrated into the software architecture.

The overall architectural structure is represented as a UML component diagram.


 Assessing Alternative Architectural Designs: Design results in a number of architectural
alternatives, each of which is assessed to determine which is the most appropriate for the problem
to be solved. Two different approaches can be used to assess alternative architectural designs:
the first uses an iterative method to assess design trade-offs, and the second applies a
pseudo-quantitative technique for assessing design quality.

An Architecture Trade-Off Analysis Method (ATAM): The Software Engineering Institute (SEI) has
developed an architecture trade-off analysis method (ATAM) that establishes an iterative
evaluation process for software architectures. The design analysis activities that follow are
performed iteratively:
1. Collect scenarios. A set of use cases is developed to represent the system from the user’s
point of view.
2. Elicit requirements, constraints, and environment description. This information is
determined as part of requirements engineering and is used to be certain that all
stakeholder concerns have been addressed.
3. Describe the architectural styles/patterns that have been chosen to address the scenarios
and requirements. The architectural style(s) should be described using one of the
following architectural views:
I. Module view for analysis of work assignments with components and the degree
to which information hiding has been achieved.
II. Process view for analysis of system performance.
III. Data flow view for analysis of the degree to which the architecture meets
functional requirements.
4. Evaluate quality attributes by considering each attribute in isolation. The number of
quality attributes chosen for analysis is a function of the time available for review and the
degree to which quality attributes are relevant to the system at hand. Quality attributes
for architectural design assessment include reliability, performance, security,
maintainability, flexibility, testability, portability, reusability, and interoperability.
5. Identify the sensitivity of quality attributes to various architectural attributes for a specific
architectural style. This can be accomplished by making small changes in the architecture
and determining how sensitive a quality attribute, say performance, is to the change. Any
attributes that are significantly affected by variation in the architecture are termed
sensitivity points.
6. Critique candidate architectures using the sensitivity analysis conducted in step 5. The SEI
describes this approach in the following manner.

Once the architectural sensitivity points have been determined, finding trade-off points is simply
the identification of architectural elements to which multiple attributes are sensitive. These six
steps represent the first ATAM iteration. Based on the results of steps 5 and 6, some architecture
alternatives may be eliminated, one or more of the remaining architectures may be modified and
represented in more detail, and then the ATAM steps are reapplied.

Architectural Complexity: A useful technique for assessing the overall complexity of a proposed
architecture is to consider dependencies between components within the architecture. These
dependencies are driven by information/control flow within the system. The three types of
dependencies:
1. Sharing dependencies represent dependence relationships among consumers who use the
same resource or producers who produce for the same consumers. For example, for two
components u and v, if u and v refer to the same global data, then there exists a shared
dependence relationship between u and v.
2. Flow dependencies represent dependence relationships between producers and
consumers of resources. For example, for two components u and v, if u must complete
before control flows into v (prerequisite), or if u communicates with v by parameters, then
there exists a flow dependence relationship between u and v.
3. Constrained dependencies represent constraints on the relative flow of control among a
set of activities. For example, for two components u and v, u and v cannot execute at the
same time (mutual exclusion), then there exists a constrained dependence relationship
between u and v.

Architectural Description Languages: Architectural description language (ADL) provides a


semantics and syntax for describing a software architecture. Hofmann and his colleagues suggest
that an ADL should provide the designer with the ability to decompose architectural components,
compose individual components into larger architectural blocks, and represent interfaces
(connection mechanisms) between components. Once descriptive, language-based techniques for
architectural design have been established, it is more likely that effective assessment methods for
architectures will be established as the design evolves.

 Architectural Mapping Using Data Flow: There is no practical one-to-one mapping for some
architectural styles, and the designer must approach the translation of requirements to design for
these styles using more general techniques. To illustrate one approach to architectural mapping,
consider the call and return architecture—an extremely common structure for many types of systems.
The call and return architecture can reside within other, more sophisticated architectures. For
example, the architecture of one or more components of a client-server architecture might
be call and return. A mapping technique called structured design is often characterized as a
data flow-oriented design method because it provides a convenient transition from a data
flow diagram to software architecture. The transition from information flow (represented as
a DFD) to program structure is accomplished as part of a six-step process:
1. The type of information flow is established,
2. Flow boundaries are indicated,
3. The DFD is mapped into the program structure,
4. Control hierarchy is defined,
5. The resultant structure is refined using design measures and heuristics, and
6. The architectural description is refined and elaborated.

Transform Mapping: Transform mapping is a set of design steps that allows a DFD with
transform flow characteristics to be mapped into a specific architectural style.

Step 1. Review the fundamental system model. The fundamental system model or context
diagram depicts the security function as a single transformation, with the external
producers and consumers of the data that flows into and out of the function. The first diagram
below depicts the level 0 context model, and the next shows refined data flow for the security
function.
Context Level DFD for SafeHome Security

1st Level DFD for SafeHome Security

Step 2. Review and refine data flow diagrams for the software. Information obtained from the
requirements model is refined to produce greater detail. For example, the level 2 DFD for
monitor sensors is examined as shown in diagram,

Level 2 DFD that refines the monitor sensors transform

and a level 3 data flow diagram is derived as shown in the diagram. The data flow diagram
exhibits relatively high cohesion.
Level 3 DFD for monitor sensors with flow boundaries

Step 3. Determine whether the DFD has transform or transaction flow characteristics.

Evaluating the DFD, we see data entering the software along one incoming path and exiting
along three outgoing paths. Therefore, an overall transform characteristic will be assumed for
information flow.

Step 4. Isolate the transform center by specifying incoming and outgoing flow boundaries.
Incoming data flows along a path in which information is converted from external to internal
form; outgoing flow converts internalized data to external form. Incoming and outgoing flow
boundaries are open to interpretation. That is, different designers may select slightly different
points in the flow as boundary locations. Flow boundaries for the example are illustrated as
shaded curves running vertically through the flow in the diagram. The transforms (bubbles)
that constitute the transform center lie within the two shaded boundaries that run from top
to bottom in the figure. An argument can be made to readjust a boundary. The emphasis in
this design step should be on selecting reasonable boundaries, rather than lengthy iteration
on placement of divisions.

Step 5. Perform “first-level factoring.” The program architecture derived using this mapping
results in a top-down distribution of control. Factoring leads to a program structure in which
top-level components perform decision making and low-level components perform most
input, computation, and output work. Middle-level components perform some control and do
moderate amounts of work.

Step 6. Perform “second-level factoring.” Second-level factoring is accomplished by mapping


individual transforms (bubbles) of a DFD into appropriate modules within the architecture.

Step 7. Refine the first-iteration architecture using design heuristics for improved software
quality. A first-iteration architecture can always be refined by applying concepts of functional
independence. Components are exploded or imploded to produce sensible factoring,
separation of concerns, good cohesion, minimal coupling, and most important, a structure
that can be implemented without difficulty, tested without confusion, and maintained
without grief.

Refining the Architectural Design: Refinement of the software architecture during the early stages of
design is to be encouraged. Alternative architectural styles may be derived, refined, and
evaluated to find the “best” approach. This opportunity for optimization is one of the true benefits
of developing a representation of the software architecture.
It is important to note that structural simplicity often reflects both elegance and efficiency.
Design refinement should strive for the smallest number of components that is consistent
with effective modularity and the least complex data structure that adequately serves
information requirements.

User Interface Design: User Interface Design creates an effective communication medium between a
human and a computer. Following a set of interface design principles, design identifies interface
objects and actions and then creates a screen layout that forms the basis for a user interface
prototype. Interface design focuses on three areas of concern:

1. The design of interfaces between software components,
2. The design of interfaces between the software and other non-human producers and
consumers of information (i.e., other external entities), and
3. The design of the interface between a human (i.e., the user) and the computer.

User Interface Design Principles: Theo Mandel coins three golden rules for user interface design:

1. Place the user in control.
2. Reduce the user’s memory load.
3. Make the interface consistent.

Place the user in control: Mandel defines a number of design principles that allow the user to
maintain control:

1. Define interaction modes in a way that does not force a user into unnecessary or undesired
actions.
2. Provide for flexible interaction.
3. Allow user interaction to be interruptible and undoable.
4. Streamline interaction as skill levels advance and allow the interaction to be customized.
5. Hide technical internals from the casual user.
6. Design for direct interaction with objects that appear on the screen.
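Principle 3 above, making interaction interruptible and undoable, is commonly realized with a command history. The sketch below is a minimal, hypothetical illustration of that idea; the `Editor` class and its operations are invented, not part of any particular toolkit.

```python
# Minimal sketch of undoable interaction via a history stack.
class Editor:
    def __init__(self):
        self.text = ""
        self._history = []  # stack recording how to reverse each action

    def insert(self, s: str) -> None:
        self._history.append(len(s))  # remember how much to remove on undo
        self.text += s

    def undo(self) -> None:
        if self._history:             # undoing with no history is not an error
            n = self._history.pop()
            self.text = self.text[:-n]

ed = Editor()
ed.insert("Hello")
ed.insert(", world")
ed.undo()            # the user changes their mind; no work is lost
print(ed.text)       # → Hello
```

Because every action records its own reversal, the user stays in control: any step can be taken back without penalty.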

Reduce the user’s memory load: The more a user has to remember, the more error-prone the
interaction with the system will be. For this reason, a well-designed user interface does not tax the
user’s memory. Whenever possible, the system should “remember” pertinent information and assist
the user with an interaction scenario that aids recall. Mandel defines design principles that enable
an interface to reduce the user’s memory load:

1. Reduce demand on short-term memory.
2. Establish meaningful defaults.
3. Define shortcuts that are intuitive.
4. Base the visual layout of the interface on a real-world metaphor.
5. Disclose information in a progressive fashion.
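Two of these principles, meaningful defaults and progressive disclosure, can be sketched in an API the same way they appear in a dialog box: the common case asks nothing of the user, and advanced options surface only when needed. The `export_report` function below is hypothetical, chosen only to illustrate the idea.

```python
# "Meaningful defaults" in code: the typical call needs no options at all;
# advanced settings are disclosed progressively as keyword-only parameters.
def export_report(data, *, fmt: str = "pdf", page_size: str = "A4",
                  margin_mm: int = 20) -> str:
    """Defaults cover the common case; power users override only what differs."""
    return f"{len(data)} rows as {fmt} on {page_size} ({margin_mm} mm margins)"

print(export_report([1, 2, 3]))             # relies entirely on defaults
print(export_report([1, 2, 3], fmt="csv"))  # overrides a single option
```

The user (here, the caller) never has to memorize the full option set to get a sensible result, which is precisely the point of reducing memory load.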

Make the interface consistent:

The interface should present and acquire information in a consistent fashion. This implies that:

 All visual information is organized according to a design standard that is maintained
throughout all screen displays,
 Input mechanisms are constrained to a limited set that is used consistently throughout the
application, and
 Mechanisms for navigating from task to task are consistently defined and implemented.

Mandel defines a set of design principles that help make the interface consistent:

 Allow the user to put the current task into a meaningful context.
 Maintain consistency across a family of applications.
 If past interactive models have created user expectations, do not make changes unless there
is a compelling reason to do so.
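One common way to keep visual information organized to a single design standard is to define that standard once and have every screen draw from it. The sketch below is an invented illustration; the `STYLE` values and `render_button` function are not from any real toolkit.

```python
# One shared design standard, referenced by every screen element,
# so consistency is enforced structurally rather than by discipline.
STYLE = {"font": "Inter", "primary_color": "#0057B8", "spacing_px": 8}

def render_button(label: str) -> str:
    """Every button draws from the same standard; none hard-codes its look."""
    return f"[{label}] font={STYLE['font']} color={STYLE['primary_color']}"

print(render_button("Save"))
print(render_button("Cancel"))  # identical standard, different content
```

Changing the standard in one place then updates every screen, which also keeps consistency across a family of applications that share the same style definition.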

The User Interface Design Process: The design process for user interfaces is iterative and can be
represented using a spiral model. Referring to Figure below, the user interface design process
encompasses four distinct framework activities:

 User, task, and environment analysis and modeling
 Interface design
 Interface construction
 Interface validation

The spiral shown in Figure below implies that each of these tasks will occur more than once, with each
pass around the spiral representing additional elaboration of requirements and the resultant design.
In most cases, the implementation activity involves prototyping—the only practical way to validate
what has been designed.

User Interface Design Process

User, task, and environment analysis and modeling: The initial analysis activity focuses on the profile
of the users who will interact with the system. Skill level, business understanding, and general
receptiveness to the new system are recorded; and different user categories are defined.

For each user category, requirements are elicited. In essence, the software engineer attempts to
understand the system perception for each class of users. Once general requirements have been
defined, a more detailed task analysis is conducted. Those tasks that the user performs to accomplish
the goals of the system are identified, described, and elaborated (over a number of iterative passes
through the spiral). The analysis of the user environment focuses on the physical work environment.
Among the questions to be asked are:

 Where will the interface be located physically?
 Will the user be sitting, standing, or performing other tasks unrelated to the interface?
 Does the interface hardware accommodate space, light, or noise constraints?
 Are there special human factors considerations driven by environmental factors?
Interface design: The information gathered as part of the analysis activity is used to create an
analysis model for the interface. Using this model as a basis, the design activity commences. The goal
of interface design is to define a set of interface objects and actions (and their screen representations)
that enable a user to perform all defined tasks in a manner that meets every usability goal defined for
the system.
Interface construction: The implementation activity normally begins with the creation of a prototype
that enables usage scenarios to be evaluated. As the iterative design process continues, a user
interface tool kit may be used to complete the construction of the interface.
Interface validation: Validation focuses on:

 The ability of the interface to implement every user task correctly, to accommodate all task
variations, and to achieve all general user requirements;
 The degree to which the interface is easy to use and easy to learn; and
 The users’ acceptance of the interface as a useful tool in their work.

Interface Design steps

User interface design, as the name suggests, refers to the creation of a model of a user interface
that is needed for some software. A UI designer performs this task. To build a good interface, the
designer must follow the interface design process, keeping in mind the end user.

How to design an interface?

Like any other process, it is essential to follow a step-by-step procedure while designing an interface.
The design process can be iterative, but the steps explained below should be followed in order:

Gathering the requirements: Before jumping straight to the implementation, it is necessary to
document the required functionality first. Gathering the requirements will help develop a clear view
of what must be done.

This step starts with a conversation with the client to know the basics of what they want from the
product. Later, in-depth interviews are conducted to dig deeper into detailed requirements. The
requirements gathered from here may include business requirements, in addition to design
requirements.

User analysis: A user-centered design cannot be implemented unless we get to know the users.
Therefore, this phase in the design process is crucial to extracting information about potential users.
The designer must conduct diligent research to understand the users and their needs. For this, the
designer can use information-gathering techniques like interviews and questionnaires.

Another technique for determining the actual and potential users is user profiling. User profiles
describe the users and their characteristics in terms of interface design. This information is then used
to establish user needs around which the interface design will be shaped.
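A user profile of the kind described above is often captured as a simple structured record. The sketch below is hypothetical; the field names are invented, and a real profile would use the attributes surfaced by the project's own interviews and questionnaires.

```python
# Hypothetical user-profile record for interface design.
from dataclasses import dataclass

@dataclass
class UserProfile:
    category: str          # e.g. "casual", "frequent", "expert"
    skill_level: str       # general computing proficiency
    domain_knowledge: str  # business understanding of the system's domain
    frequency_of_use: str  # how often this class of user works with the system

novice = UserProfile("casual", "low", "high", "weekly")
# Profiles like this drive design decisions: how much to rely on recognition
# over recall, which defaults to establish, and which shortcuts to expose.
print(novice.category)  # → casual
```

Grouping users into such profiles is what lets requirements be elicited per user category, as described in the analysis activity earlier.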

Contextual task analysis: In this step, the main tasks to be performed using the interface must be
observed and evaluated. This process should be done in the context of the environment where the
interaction will be carried out. The contextual task analysis will help the designer generate essential
user-centered design guidelines.

Contextual task analysis aims to construct an interface design that supports the users’ work tasks. It
involves interviews and observations of the users performing the tasks in their natural environment.
It is also essential to study the physical and sociocultural environment of the job context. The data
collected from the observations is analysed and used to extract design requirements.

Implementation: This step is where the designer finally implements the design according to the
requirements gathered in the previous stages. The implementation can be subdivided into further
stages like visual design, navigation design, and documentation of screen design standards.

The detailed design is then carried out. The designer implements the windows, dialog boxes,
navigation, and all other elements that make up the user interface. It's vital to make the design
aesthetically pleasing and maximize the usability and user experience.

Testing: Before deploying the final product, it is crucial to test it out. Testing an interface design may
include, but is not limited to, usability and user testing.

Testing is conducted using prototypes that are simulations of the final interface design. These
prototypes are presented to end-users and clients, who perform actions on them and evaluate them.
Improvements to the design are made as suggested, and it is deployed only after producing an
acceptable result from iterative testing.
