Data Migration Plan
1. Introduction
The overall plan for data migration is to consolidate, merge and reconcile the migration of data from the existing
Maximo 4 systems into a single, multi-site-enabled Maximo 7.6 instance. This enables XYZ to reap the benefits of
a unified system, establishing standard maintenance practices and inventory management at an organisational
level, rather than at a site level as is done today.
Additionally, the substantial improvements and new features added to the Maximo Asset Management product
between versions 4 and 7.6 will equip XYZ with a top-tier Asset Management system, enabling them to execute
quality asset management activities within their business, and move further from reactive toward preventative,
highly traceable and safer asset management.
1.1 Document Purpose
The purpose of this document, as part of the XYZ Maximo Upgrade Project, is to define the plan for:
The consolidation and migration of existing data
Infrastructure and software requirements to perform migration
Client responsibilities for providing and cleansing source data
1.2 Overview
This plan will describe the approach, deliverables, and required inputs to achieve successful data migration.
Data Migration will occur on a per site basis, with common master data being prepared for all in-scope sites,
and transactional data being prepared for each specific site.
1.3 Audience
The intended audience for this document is as follows:
XYZ (Customer) Project Management Team
1.4Related Documents
This document should be read in conjunction with the following related documents:
Project Management Plan
XYZ Cloud Architecture Plan
1.5 Assumptions
The following assumptions are required, and the items described must be made available, to support the
completion of the activities defined in this plan.
1.5.1 General
ABC and XYZ are committed to open and honest communication
ABC and XYZ are committed to the success of the project
1.5.2 Dependencies of XYZ resources
XYZ is able to provide resources who can supply current database backups of the source Maximo
environments as requested
Oracle database licenses to facilitate a single Data Migration environment will be purchased prior to
build phase commencement
Supporting tools such as IBM Data Conversion Workbench can be installed on the Data Migration
environment
Oracle DB is at a patch level supported by IBM Data Conversion Workbench
1.5.3 Infrastructure
A data migration server environment will be available at the beginning of the build phase
XYZ will provide ABC with connectivity and access to copies of the existing Maximo 4 applications
All data migration performed by ABC will use the MAXIMO 4 databases only. No external data sheets
will be used.
All data cleansing will be performed in the current MAXIMO 4 system.
Activity | Responsible | Timing
Provide exports of all existing MAXIMO 4 databases | XYZ System Administrator | Monthly, or within one week of request, until Go Live
Provide clarifications on the values in existing database columns | XYZ System Administrator / System Support Analyst | Weekly meeting until the start of SIT
Maximo 7.6 licenses for development | ABC / Commercial Department | April 2017
DB2 licenses for development project | ABC / Commercial Department | Complete
IBM Data Conversion Workbench | ABC / Development Team | Complete
1.7 Risks
The following table lists the risks to delivering the Migration Plan:
Risk | Impact (H,M,L) | Likelihood (H,M,L) | Mitigation Plan
Significantly corrupted source data | H | L |
2. Scope Identification
2.1 Overview
The overall data migration of the Maximo Upgrade Project will be conducted in four distinct phases.
1. Data Mapping
2. Consolidation and Merging
3. Data Migration
4. Data Reconciliation
The source Maximo 4 databases considered by this project's Data Migration Plan include only the below:
*2 – These items will also be interfaced from ORACLE Financials, but will be migrated in the first instance in order to
support overall system testing without dependency on the Financial Interface. The client can perform reconciliation of this
data, such as Chart of Accounts and Companies, so that the data in MAXIMO 4 completely matches the data in Oracle
Financials.
Data is not relevant in the new system and there is no value in migrating it
Data will be provided separately in Excel, since current data needs to be improved before it is loaded
2.3 Additional Data Sets and Sources
Datasets that do not exist in the current system and must be prepared for Maximo 7.6, or data sets
that will be migrated from other systems into Maximo, are detailed in this section.
Data provided by the client in Excel sheets will focus on improving business-level data quality
through the following activities:
• Identify business benefits and improvement goals
• Identify data improvements needed to support business benefits
• Conduct profiling for existing data that needs improvements
• Ensure documents reference only active records
• Define data cleansing activities
• Define data restructuring activities
• Define new data sets required
• Decide if improvement/cleansing to be done before or after the upgrade.
• Factor in XYZ culture and engage with Change Management
5 | CURRENCY, EXCHANGE | Only AUD currency will be used, and exchange rates are not managed in MAXIMO.
6 | CONTRACT | Complete contract details and line items can be loaded into the system, if prepared by the client in Excel sheets. Data templates will be provided to collect this data.
7 | FAILUECLASS, PROBLEMCODE, CAUSE, REMEDY | The Failure Class hierarchy should be prepared by the client in Excel sheets. Data templates will be provided to collect this data. Failure Codes are going to be completely restructured by the client in Excel and loaded into MAXIMO 7.6. Existing data in MAXIMO 4 will be remapped or defaulted, and the client is expected to provide mapping tables between the old and new failure codes. For codes where there is no mapping, the old failure codes will be migrated into the Failure Remark long description field.
8 | MEASUREUNIT | Measurement units used for meter and measurement reporting will need to be defined by the client in Excel sheets. Data templates will be provided to collect this data.
9 | EQX1 (Lubrication) | The existing lubrication custom application should be defined as a set of Preventive Maintenance and Job Plan records. Data should be prepared by the client in Excel sheets. Data templates will be provided to collect this data.
10 | TOOL | The existing list of tools needs to be expanded and cleansed. Data should be prepared by the client in Excel sheets. Data templates will be provided to collect this data.
11 | JOBLABOR, JOBMATERIAL, JOBOPERATION, JOBPLAN, JOBTOOL, JPASSETSPLINK | Job Plans and supporting Job Plan detail such as Operations, Materials, Labour, Services and Tools will be progressively or partially consolidated by the client in the existing MAXIMO 4, or prepared separately in Excel for migration into MAXIMO 7.6, into a unified data set across all sites; this exercise can continue in MAXIMO 7.6. Data Migration will migrate the data available in MAXIMO 4 into MAXIMO 7.6. Since job plan numbers will change, the client is expected to provide mapping tables between the old and new job plan numbers.
12 | LOCAUTH, MAXGROUPS, MAXUSERAUTH, MAXUSERGROUPS, TOLERANCE, USERGROUPAUTH | Signature security profiles will be redefined for the new system. This data will be defined by the client in the MAXIMO 7.6 test system and migrated through the built-in migration package capability.
13 | WORKPRIORITY, WORKTYPE | To be discussed and confirmed with the operational team, but expected to be configured similarly to the existing MX4 systems.
These sheets should be stored in a document management system accessible by ABC, so that at
each iterative release of data migration scripts the latest XYZ provided data can be tested and (where
successful) incorporated into the next database builds released to other environments.
Like data cleansing activities, these sheets should be scheduled for release by XYZ, with the intended
schedule available to ABC to facilitate planning for new sheets to be added to the data migration
process by ABC.
As the data migration progresses, there will be partial progressive releases in the test environment
that can be reviewed by the client to verify how the data looks in the new system. The frequency of the
releases will be determined by the ABC project team.
3.2 Approach
1 | Source databases restored to common environment | XYZ must provide recent backups of the source databases to ABC. These will be restored into a single database server for analysis and script development.
2 | Data dictionary generated from Maximo 4 systems | A list of tables and columns, with types, lengths, required flags, same-as values etc., will be generated from MX4.
3 | Data dictionary generated from OOTB Maximo 7.6 system | A list of tables and columns, with types, lengths, required flags, same-as values etc., will be generated from MX76.
4 | Data dictionary comparison | The data dictionaries will be compared to identify columns that:
• Directly map
• Require length increases in the 7.6 instance
• Require some type conversion
• Contain values in standard Maximo extension fields (e.g. wo1, woeq1) that will require remapping, consolidation into long descriptions, or continued use of MX76 extension fields
For new fields, defaults will be proposed and XYZ can confirm whether the proposed default value is correct or should be populated in a different way.
5 | Exception management | Any source data columns listed as in-scope that do not have an easily identified destination column will be referred to the ABC architect for review and a decision as to whether the data is stored in an existing MX76 column, in a new column to be added, or as free text / long description / work log etc. These data mapping and migration decisions will be communicated to XYZ representatives for review and confirmation during the data migration process.
Columns that require type conversion will be provided as a list to the ABC architect for confirmation before being added to the migration scripts.
All columns that are required in MX76 but do not have an equivalent source column in MX4 will be collated and provided as a list of columns with suggested default values. XYZ will be required to review and agree to those default values. Once agreed, these will be added to the scripts used for migration.
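The dictionary comparison in step 4 can be sketched as a simple classification pass. This is a minimal illustration, assuming each dictionary is a mapping of (table, column) to (datatype, length); in practice these rows would be extracted from the MX4 and MX76 system catalogs, and the table and column names below are illustrative only:

```python
# Classify source columns against the destination data dictionary.
# Each dictionary maps (table, column) -> (datatype, length); these rows
# would be generated from the MX4 and MX76 catalogs in steps 2 and 3.

def classify_columns(mx4, mx76):
    result = {"direct": [], "lengthen": [], "convert": [], "exception": []}
    for key, (src_type, src_len) in mx4.items():
        if key not in mx76:
            result["exception"].append(key)    # refer to the ABC architect
        else:
            dst_type, dst_len = mx76[key]
            if src_type != dst_type:
                result["convert"].append(key)  # type conversion needed
            elif src_len > dst_len:
                result["lengthen"].append(key) # length increase in 7.6
            else:
                result["direct"].append(key)   # maps directly
    return result

mx4 = {
    ("WORKORDER", "WONUM"): ("VARCHAR", 10),
    ("WORKORDER", "DESCRIPTION"): ("VARCHAR", 100),
    ("WORKORDER", "WO1"): ("VARCHAR", 20),     # Maximo extension field
    ("WORKORDER", "ESTDUR"): ("INTEGER", 0),
}
mx76 = {
    ("WORKORDER", "WONUM"): ("VARCHAR", 25),
    ("WORKORDER", "DESCRIPTION"): ("VARCHAR", 50),
    ("WORKORDER", "ESTDUR"): ("DECIMAL", 0),
}

report = classify_columns(mx4, mx76)
```

Columns landing in the "exception" bucket correspond to step 5 below and would be referred for an architect decision.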
3.3 Recommendations
Changes to the MX4 data dictionary should be frozen at commencement of the build phase.
A formal issue tracking system should be implemented so that ABC can raise issues that
require attention by XYZ data cleansing team members. This will ensure visibility and
traceability of issues with source data that are uncovered during data migration preparation
activities and testing.
Relevant, existing columns in MX76 should be preferentially used as the destination for MX4
columns. New custom columns are to be considered a last resort when carrying forward
custom or deprecated legacy columns.
Custom columns with little, or completely uniform, data will be reviewed to determine
redundancy before migration occurs.
Increases in decimal precision when moving from MX4 to OOTB MX76 are acceptable.
Summary fields from MX4 that greatly exceed the field length of the related field in MX76 will
be truncated, and the full value stored in (or added to) a long description field where available.
Where re-mapping is required for source data to ensure MX76 compatibility, the data migration
team will perform the mapping and incorporate it into the data migration scripts.
Where client input is required, XYZ will be provided with the MX76 target values and asked to
provide transformation/mapping rules for ABC to implement.
To allow the profiling to be accurate, production data from in-scope Source Systems (as listed
above) is required during this activity. Use of the current production environment is not
recommended as the analysis process could have an adverse effect on the performance of
the environment.
Access to an isolated DM environment, containing databases periodically refreshed from a
single point in time from their production parents, will be required to perform this task. This
environment requires sufficiently powerful specifications and storage to run active copies of all
source databases (Oracle), a consolidated IBM DB2 (or destination) database, a data
migration tool such as IBM Data Conversion Workbench (or the relevant tool used) and smaller
tools such as Notepad++.
Data cleansing activities performed by XYZ should be performed as staged releases with
clear objectives and release dates, which are communicated with the Data Migration team.
This ensures that the Data Migration team can plan and react appropriately to changing data.
The Location and Asset hierarchies will be cleansed and restructured in the source MX4
systems by the client. ABC will not take part in the data cleansing of any MAXIMO data.
All data cleansing activities will be reflected directly in MAXIMO 4, so migration scripts
prepared by ABC will only consider the data that is current in the 10 MAXIMO 4
databases. This removes any dependency between ABC data migration activity and XYZ
data cleansing activity.
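The truncation recommendation above (summary fields exceeding an MX76 length are cut, with the full value preserved in a long description) could be sketched as follows. The maximum length here is illustrative, not an actual MX76 column limit:

```python
# Truncate an over-length summary field and preserve the full original value
# in the long description, per the recommendation above. max_len is an
# illustrative limit, not an actual MX76 column length.

def truncate_with_overflow(summary, long_desc, max_len=100):
    if len(summary) <= max_len:
        return summary, long_desc
    # Keep the full original text by prepending it to the long description.
    overflow = summary if not long_desc else summary + "\n" + long_desc
    return summary[:max_len], overflow

short, full = truncate_with_overflow("PUMP FAILED" + " DETAIL" * 30, "", max_len=40)
```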
3.7 Deliverables
This execution plan will be rigorously tested for accuracy and repeatability
prior to go-live, in cyclical unit, integrated system and ultimately UAT
testing.
4.2 Approach
Where there are conflicts for these records, each record will have a suffix
appended to its unique identifier. The current decision is to renumber only
duplicate locations and assets and leave the rest unchanged, as they are
referenced on drawings and asset plate tags. For duplicate assets,
renumbering will be done by appending 1 and 2 to the respective assets.
Client merging / consolidation may happen in two areas:
2 | Client merged/consolidated data | Source databases: Records such as locations (and their positions in location hierarchies) may be updated in the source systems, which will trickle through to data migration via the continuously provided database exports that are used as input to data migration design and testing.
Load templates: Some data may be merged by XYZ and provided as data load sheets. Where mapping is required to consolidate records, XYZ will provide the consolidation rules, which will in turn be integrated into the data migration scripts by ABC.
3 | Site transactional data | Most data sets (such as Work Orders) will simply be migrated from their single-site source system into Maximo 7.6 with an additional reference to the site per record.
4 | Cross-site ID duplication | Some records such as Work Orders are referenced by an ID column (WorkorderID in this example) on related records such as Long Description. Where duplication exists across sites, update scripts will be created to negotiate and replace those duplications, preserving relationships between parent and child records.
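The cross-site duplication handling described above could look like the following sketch, in which duplicate keys receive a numeric suffix and child records are remapped through the same old-to-new mapping so relationships survive. This is one possible convention under stated assumptions; the real update scripts would run as SQL against the staging database:

```python
# Renumber duplicate record IDs across sites, giving second and subsequent
# occurrences a numeric suffix, and remap child references through the same
# old->new mapping so parent/child relationships are preserved.

def renumber_duplicates(records, children):
    seen = {}
    mapping = {}  # (site, old_id) -> new_id
    for site, rec_id in records:
        n = seen.get(rec_id, 0)
        seen[rec_id] = n + 1
        # First occurrence keeps its number; duplicates get suffix -1, -2, ...
        mapping[(site, rec_id)] = rec_id if n == 0 else f"{rec_id}-{n}"
    remapped_children = [
        (site, mapping[(site, parent_id)], payload)
        for site, parent_id, payload in children
    ]
    return mapping, remapped_children

records = [("SITE1", "WO1000"), ("SITE2", "WO1000"), ("SITE2", "WO2000")]
children = [("SITE2", "WO1000", "long description row")]
mapping, kids = renumber_duplicates(records, children)
```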
# | Rule | Description
1 | Inventory balances >= 0 | No inventory item with a balance of -1 or less will be migrated.
2 | Mandatory fields must be populated | Any fields marked as mandatory on the source system must be populated with valid values.
3 | Domain-based values must be valid | Any attribute that has a value associated with a domain/value list must have a value that exists in that domain.
4 | All assets must exist in a correctly defined hierarchy | All assets (except top-level assets) must have a direct ancestry path to a single top-level asset. No asset should have an ancestor that is subsequently also a descendant (no loops). No asset should refer to a parent that does not exist. All assets parented by a decommissioned asset should also be decommissioned.
5 | All locations must exist in a correctly defined hierarchy | All locations (except top-level locations) must have a direct ancestry path to a single top-level location.
6 | No orphan records / invalid foreign keys | No record should refer to a parent record that does not exist (e.g. an Asset that references a parent Asset record that does not exist). No record should reference another record that does not exist (e.g. a Work Order that references a non-existent Asset).
7 | Unique primary keys | Primary keys within a single Maximo 4 instance must be unique.
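Rules such as 1 and 6 can be expressed directly as SQL checks against the staging database. The small self-contained illustration below uses SQLite in place of the real staging platform, with hypothetical table and column names:

```python
import sqlite3

# Demonstrate rule 1 (inventory balances >= 0) and rule 6 (no orphan foreign
# keys) as SQL checks. SQLite stands in for the staging database; the table
# and column names are illustrative, not actual Maximo schema.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE asset (assetnum TEXT PRIMARY KEY, parent TEXT);
    CREATE TABLE invbalances (itemnum TEXT, curbal REAL);
    INSERT INTO asset VALUES ('A1', NULL), ('A2', 'A1'), ('A3', 'MISSING');
    INSERT INTO invbalances VALUES ('ITEM1', 10), ('ITEM2', -5);
""")

# Rule 1: items with negative balances are excluded from migration.
negative = [r[0] for r in con.execute(
    "SELECT itemnum FROM invbalances WHERE curbal < 0")]

# Rule 6: assets whose parent does not exist are orphans.
orphans = [r[0] for r in con.execute("""
    SELECT a.assetnum FROM asset a
    LEFT JOIN asset p ON a.parent = p.assetnum
    WHERE a.parent IS NOT NULL AND p.assetnum IS NULL
""")]
```

The same query shapes would be scripted per rule and executed as part of the reconciliation and integrity checks.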
4.4 Deliverables
5.2 Approach
The approach below will be an iterative process (for all steps prior to step 14) that will be deployed on
a site-by-site basis. This means that a complete script will exist for each source site to migrate
that site's complete data set, allowing data migration to be performed in a staged format. These
separate scripts will be tested both separately and as a single overall migration to ensure duplicate
key issues are detected and resolved prior to integrated system testing.
1 | Database backups received | For testing, as data is being migrated from 10 different databases, it is critical to be able to obtain the latest database exports for all 10 systems within one week of notification. The client will establish a process to provide the upgrade project with full database exports of all MAXIMO 4 systems within one week of request.
2 | Maximo 7.6 environment prepared for migration | A Data Migration Maximo (and required add-ons) environment will be prepared in advance of Data Migration execution. Whilst some customisations are expected during implementation, a base instance of Maximo will allow the data migration team to initiate data migration.
3 | Maximo 7.6 DB backup | The base Maximo 7.6 system database should be backed up prior to any attempted migration. This ensures a restoration point has been established, allowing for roll-back.
4 | Maximo 7.6 staging database prepared | A copy of the MX7.6 database will be created as a staging database. All data will be loaded into the staging database, either by script or using a migration tool.
5 | System data load into Staging Database | Scripted load of system data, in the following order:
1. Org and Site
2. System Variables
3. Setup Options
4. Value List Domains
5. Site Addresses
6. Bill To and Ship To
7. Default Accounts
8. Location Systems
9. Default Locations
10. Calendars
11. Holidays
12. Shifts
13. Work Periods
14. Work Types
15. Work Priorities
16. Tax Codes
17. Currencies
18. Exchange Rates
19. Document Types
20. Attachment Folders
21. Migration Packages
22. Migration Objects
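The ordering above matters because later sets depend on earlier ones (for example, Site Addresses depend on Org and Site). A minimal sketch of a runner that executes the load steps strictly in sequence and stops at the first failure follows; the `execute` callback is a hypothetical stand-in for running the real load script for a step:

```python
# Run load steps strictly in the documented order, stopping at the first
# failure so later, dependent sets are never loaded against missing parents.
# `execute` is a stand-in for invoking the real load script for a step.

LOAD_ORDER = ["Org and Site", "System Variables", "Setup Options",
              "Value List Domains", "Site Addresses"]  # ... through step 22

def run_loads(steps, execute):
    completed = []
    for step in steps:
        if not execute(step):
            return completed, step  # report where the load stopped
        completed.append(step)
    return completed, None

# Simulated run in which the 'Setup Options' load fails:
done, failed_at = run_loads(LOAD_ORDER, lambda s: s != "Setup Options")
```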
6 | Source database(s) restored to common staging environment | All Maximo 4 databases in scope will be restored into the Staging / Data Migration environment (or a directly connectable environment with high-speed connectivity) from current backups.
SSMA conversion applied to source data | The data mapping rules developed during the Data Mapping phase will be used as input for conversion. Data will be converted and stored into the staging environment. SSMA (or the relevant tool selected) is a tool specifically developed to efficiently plan for and automate the conversion of Oracle databases into SQL Server (or destination database) format.
7 | Supporting data conversion into staging database | SSMA load of supporting data, in the following order:
23. People Master
24. Crafts and Skills
25. Labour
26. Premium Pay
27. Craft Rates
28. Person Groups
29. Financial Periods
30. GL Components
31. Chart of Accounts
32. Companies
33. Purchasing Terms
34. Ordering Units
35. Oracle Projects
36. Oracle Project Tasks
37. Oracle Categories
38. Oracle Exp. Types
39. Commodity Groups
40. Linked Documents
41. Attribute Domains
42. Spec. Attributes
43. Classifications
44. Class Attributes
45. Measurement Units
46. Meter Master
47. System Users
48. Security Groups
49. Start Centres
8 | Master data conversion into staging database | SSMA load of master data, in the following order:
50. Locations
51. Assets (Equipment)
52. Asset Classifications
53. Asset Specifications
54. Asset Meters
55. Asset Spare Parts
56. Condition Points
57. Job Plans
58. JP Operations
59. JP Labour
60. JP Material
61. JP Services
62. JP Tools
63. Failure Codes
64. Failure Classes
65. Routes
66. Route Stops
67. Preventative Main.
68. PM Meters
69. PM Job Sequence
70. Item Master
71. Item Vendors
72. Storerooms
73. Inventory
74. Inv. Balances
75. Tools
76. Service Items
9 | Transactional data conversion into staging database | SSMA load of transactional data, in the following order:
77. Work Orders
78. Work Order Tasks
79. Labour Transactions
80. Inventory Reservations
81. Inventory Issues
82. Inventory Transfers
83. Inventory Adjustments
84. Tool Usage Trans.
85. Material Receipts
86. Service Receipts
87. Purchase Requisitions
88. PR Lines
89. Request for Quotations
90. RFQ Lines
91. Purchase Orders
92. PO Lines
93. PO Event Log
94. Invoices
95. Invoice Lines
96. Price Contracts
97. Price Contract Lines
98. Labour Contracts
99. Warranty Contracts
100. Lease Contracts
101. Lease Item Lines
102. Meter Readings
103. Measurement History
10 | Reconciliation and integrity checks | The prepared reconciliation report(s) will be executed against the Staging environment to ensure that the expected data volumes and financial totals have been successfully migrated to the Staging database.
Integrity testing will be performed using the IBM Maximo Integrity Checker, to assure the data migration team that no integrity errors were introduced in the last build.
13 | Unit Testing / Compatibility testing | After each iterative release, unit testing will be performed to detect both data migration and data quality issues that are apparent in the application.
The conversion from MX4 to MX76 requires significantly more complex data insertion to ensure that all MX76 internal rules are met successfully. Testing of these internal rules will be performed by the data migration team to reduce downstream impact.
15 | Application issue resolution | Investigate, troubleshoot, resolve and deploy fixes for all application errors caused by the data migration. Data migration issues will result in updated scripts to be incorporated in the next release. Data quality issues will be reported back to XYZ for remediation.
14 | Integrated system / User Acceptance Testing | Integrated System testing and End User testing to be performed by the client for verification and sign-off.
15 | Application issue resolution | Investigate, troubleshoot, resolve and deploy fixes for all application errors caused by the data migration. Issues will be resolved in line with agreed impact/priority timelines (not described in this document).
1 | Purchase Order closeout | Agreement was reached during the data migration workshop on the approach to migrating the in-flight data. The agreed approach is to recreate new Purchase Orders for all records with partial receipts and close the old Purchase Orders, so they can no longer be received against. New Purchase Orders will have only the outstanding quantity due for receiving. At system roll-out these new Purchase Orders will be loaded by the data migration and bulk re-approved in the system, so they are interfaced to the Oracle Applications at that point in time. A similar exercise was done several years ago, when the chart of accounts was changed and existing Purchase Orders needed to continue receiving against the new GL Accounts.
It was agreed that the new and existing Purchase Orders will have old-PO and new-PO fields respectively, to allow procurement officers to identify the new PO number to be used for receiving and invoicing. Accounts Payable and suppliers will need to be notified, and it is expected that several hundred POs will be affected by this change.
Purchase Orders which are partially or fully received with un-invoiced receipts will have their non-invoiced receipts reversed against the old PO and then received again against the new PO number, to facilitate a streamlined invoicing process.
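The closeout rule above (new POs carry only the quantity still due) reduces to a simple per-line computation. The sketch below uses hypothetical line fields, not actual Maximo PO line columns:

```python
# For each partially received PO line, the replacement PO line carries only
# the outstanding quantity (ordered minus received). Lines already fully
# received carry nothing forward. Field names are illustrative.

def outstanding_lines(po_lines):
    new_lines = []
    for line in po_lines:
        remaining = line["ordered"] - line["received"]
        if remaining > 0:
            new_lines.append({"item": line["item"], "ordered": remaining})
    return new_lines

old_po = [
    {"item": "BEARING", "ordered": 10, "received": 4},
    {"item": "SEAL", "ordered": 5, "received": 5},
]
new_po = outstanding_lines(old_po)
```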
2 | Shut down period | In order to lock down data entry, access to the Maximo 4 system should be restricted to read-only 24 hours prior to migration.
4 | Restricting document updates | Changes to the Oracle Financials should be restricted during the go-live cut-over period. Master data in MAXIMO that is interfaced from Oracle will be reconciled after go-live to ensure the systems are fully aligned. Invoices should not be entered in Oracle during the go-live cut-over period.
5.5 Validation Rules
Where applicable, the data migration will be subject to following database integrity and system
validation rules in the destination system for all migrated data:
• Unique primary keys (no duplicate keys in data numbering)
• Valid foreign keys (referencing existing records)
• Valid relationship cardinality (one-to-one, one-to-many, many-to-many)
• Valid parent-child links in the hierarchy (no network loops in hierarchies)
• Mandatory fields are populated with default values if absent
• Default values populated for blank values
• Values match the destination data type
• Values are within valid domains (value lists, number range, date ranges)
• Correct data key maps are applied where numbering has changed
• Standard Maximo use cases pass compatibility testing
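The hierarchy rule above (valid parent-child links, no network loops) can be checked by walking each record's parent chain. A sketch over an in-memory parent map follows; in practice the map would be built from the staging hierarchy tables:

```python
# Walk each record's parent chain to verify it terminates at a top-level
# record (parent None) without revisiting a node (no loops) and without
# referencing a missing parent.

def check_hierarchy(parents):
    bad = {"loop": [], "missing_parent": []}
    for node in parents:
        seen, current = set(), node
        while parents.get(current) is not None:
            if current in seen:
                bad["loop"].append(node)
                break
            seen.add(current)
            current = parents[current]
            if current not in parents:
                bad["missing_parent"].append(node)
                break
        # Chain ended at a top-level record: node is valid.
    return bad

# A1 is top-level, A2 is valid, A3/A4 form a loop, A5 points at a ghost.
parents = {"A1": None, "A2": "A1", "A3": "A4", "A4": "A3", "A5": "GHOST"}
issues = check_hierarchy(parents)
```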
5.6 Tools
Data migration
Data migration will be performed using SQL Server Migration Assistant (SSMA), a free, supported
tool from Microsoft that simplifies the database migration process from Oracle to SQL Server and Azure
SQL DB. SSMA automates all aspects of migration, including migration assessment analysis, schema
and SQL statement conversion, data migration, and migration testing.
Reconciliation reporting
Data migration reconciliation will be performed using SQL scripts that will be developed to list existing
and migrated record counts and financial totals.
The output of these scripts may be merged and presented using Microsoft Excel.
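The reconciliation scripts described above boil down to comparing per-set counts (and totals) between source and target. A minimal sketch, assuming the counts have already been collected from both systems by SQL scripts:

```python
# Compare source and migrated record counts per data set and report any
# mismatches; the same shape works for financial totals. The counts would
# be produced by the reconciliation SQL scripts run against each system.

def reconcile(source_counts, target_counts):
    mismatches = {}
    for data_set, src in source_counts.items():
        tgt = target_counts.get(data_set, 0)
        if src != tgt:
            mismatches[data_set] = (src, tgt)
    return mismatches

source_counts = {"Locations": 1200, "Assets": 8500, "Work Orders": 40321}
target_counts = {"Locations": 1200, "Assets": 8499, "Work Orders": 40321}
diffs = reconcile(source_counts, target_counts)
```

An empty result indicates the counted sets reconcile; any entry identifies the set and the source/target pair that disagree.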
Integrity testing
The IBM Maximo Integrity Checker is provided as part of the Maximo Asset Management installation
and is designed to report on Maximo and database integrity issues.
5.7 Deliverables
Maximo 7.6 Database | A MX76 database that comprises merged in-scope MX4 data, client-prepared data and auto-populated default values, and that meets validation and reconciliation targets.
Database refresh system | An automated system that allows a team to request an on-demand database refresh that will replace all data in their Maximo environment with the latest release of Data Migration prepared data.
Counted Sets
1. GL Components
2. Chart of Accounts
3. Companies
4. Oracle Projects
5. Linked Documents
6. Locations
7. Assets (Equipment)
8. Condition Points
9. Job Plans
10. Routes
11. Preventative Main.
12. Item Master
13. Inventory
14. Inv. Balances
15. Work Orders
16. Work Order Tasks
17. Purchase Requisitions
18. Request for Quotations
19. Purchase Orders
20. Invoices
21. Labour Transactions
22. Inventory Reservations
23. Inventory Issues
24. Inventory Transfers
25. Inventory Adjustments
26. Tool Usage Trans.
27. Material Receipts
28. Service Receipts
2 | Financial transaction total comparison | In addition to simple record counts, finance-related records will be totalled in both the source and target system. The balances of each of the data sets listed below should be equal in Maximo 7.6 post-migration to the (locked) totals at time of migration. Financial records that will be compared include:
1. GL Components
2. Chart of Accounts
3. Companies
4. Oracle Projects
5. Linked Documents
6. Locations
7. Assets (Equipment)
8. Condition Points
9. Job Plans
10. Routes
11. Preventative Main.
12. Item Master
13. Inventory
14. Inv. Balances
15. Work Orders
16. Work Order Tasks
17. Purchase Requisitions
18. Request for Quotations
19. Purchase Orders
20. Invoices
21. Labour Transactions
22. Inventory Reservations
23. Inventory Issues
24. Inventory Transfers
25. Inventory Adjustments
26. Tool Usage Trans.
27. Material Receipts
28. Service Receipts
6.3 Deliverables
Deliverable | Description | Responsible
Record Migration Report | A report that directly compares the source and destination record counts, financial transaction total dollar values, and Primary Key matched sets for the in-scope data sets. | Data Migration Analyst
Integrity Check report | An extract of the Maximo Integrity Checker report logs will be provided, showing the number of integrity errors. Zero errors should be reported. Warning or informational level messages will not be considered. | Data Migration Analyst