TMS configuration:
To configure TMS, log in to the domain controller system with client 000 and the DDIC user; in this landscape the development system acts as the domain controller.
First log in to the domain controller system and go to transaction STMS. A pop-up appears; enter the domain controller details in that pop-up and save.
Then log in to the other systems (for example the quality system) with client 000 and the DDIC user and go to STMS. The same pop-up appears; click the Other Configuration button, enter the domain controller's host name and instance number, and save.
Do the same for the production system as well.
Then log in to the domain controller system and approve the quality and production systems:
STMS - Overview - Systems - select the system and approve.
Next, configure the transport routes and layers.
Routes define the path along which objects move from one system to another.
Routes are of 2 types: consolidation route and delivery route.
A transport layer determines which route a development object follows from one system to the next.
There are 2 default layers: the Z<SID> layer and the SAP layer.
Z<SID>: customer developments and customizing objects
SAP: SAP standard objects
==============================================================================
TR import running for a long time:
First, I check the TR logs in transaction STMS_IMPORT.
Then I check whether the RDDIMPDP job is running; if not, I schedule it by executing the RDDNEWPP
program.
I check that the cofiles and data files are available in the /usr/sap/trans directory.
I check whether tp can connect to the database by running the "tp connect <SID>" command at OS level.
I check whether the tmp directory under /usr/sap/trans is being updated; if it is not, I check whether
files with names like .LOB and .LOC have been created and, if so, rename them.
I check the TRJOB and TRBAT tables for old (stale) entries.
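A minimal sketch of those OS-level checks, assuming a Linux host, the standard transport directory /usr/sap/trans, and a placeholder SID "DEV"; the request number and the tp profile path are illustrative:

    # check that the cofile and data file of the stuck request exist and are readable
    ls -l /usr/sap/trans/cofiles/K900123.DEV /usr/sap/trans/data/R900123.DEV
    # test whether tp can reach the target system and its database
    tp connect DEV pf=/usr/sap/trans/bin/TP_DOMAIN_DEV.PFL
    # look for stale files in the tmp directory that may block the import queue
    ls -ltr /usr/sap/trans/tmp/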
TR import failed:
I check the TR log files first; based on the log file and the return code I decide the next steps. With return code 8 (errors during import, e.g. syntax errors or missing objects) we contact the TR owner; with return codes above 8 the import was cancelled for technical reasons, so I check further:
I run r3trans -d to check the database connection.
Then I check whether the tp and R3trans binaries are outdated.
I check at OS level whether the cofiles and data files have the required permissions.
I check the log files (ALOG, ULOG, SLOG) for error information.
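A minimal sketch of those checks at OS level, run as the <sid>adm user on a Linux host; the exact SLOG/ALOG/ULOG file names depend on the year, week, and SID:

    # verify the database connection used by the transport tools
    r3trans -d
    echo $?                      # exit code 0 means the connection is fine; details go to trans.log
    cat trans.log
    # check ownership and permissions of the transport files
    ls -l /usr/sap/trans/cofiles /usr/sap/trans/data
    # inspect the tp action logs for errors
    ls -ltr /usr/sap/trans/log/SLOG* /usr/sap/trans/log/ALOG* /usr/sap/trans/log/ULOG*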
========================================================================
Kernel Upgrades:
First I download the database-independent and database-dependent kernel archives (SAPEXE and SAPEXEDB) from the SAP marketplace.
I create a folder, copy those archives into it, and extract them with the
SAPCAR -xvf command.
Then stop SAP.
Stop the services.
Take a backup of the old kernel directory.
Copy the extracted (new) kernel files over the old kernel directory.
Run the saproot.sh script as root to reset ownership and permissions.
Start the services, then start SAP.
The central kernel is upgraded under the global directory, e.g. /usr/sap/<SID>/SYS/exe/uc/<platform>/ (linuxx86_64 on Linux with a Unicode kernel).
Whenever the SAP system starts, the sapcpe program is triggered and copies the kernel from the global directory into the
instance-specific kernel (exe) directory.
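A minimal sketch of the whole sequence at OS level, assuming a Linux host, a Unicode kernel, instance number 00, and placeholder SID, path, and archive names:

    # as <sid>adm: stop SAP and the services
    sapcontrol -nr 00 -function Stop
    sapcontrol -nr 00 -function StopService

    # back up the existing central kernel directory
    cd /usr/sap/SID/SYS/exe/uc
    cp -pr linuxx86_64 linuxx86_64_backup_$(date +%F)

    # extract the new kernel archives over the old files
    cd linuxx86_64
    SAPCAR -xvf /tmp/kernel/SAPEXE_<patch>.SAR
    SAPCAR -xvf /tmp/kernel/SAPEXEDB_<patch>.SAR

    # as root: reset ownership and permissions of the new binaries
    ./saproot.sh SID

    # as <sid>adm: start the services and SAP; sapcpe copies the new kernel to the instance directory
    sapcontrol -nr 00 -function StartService SID
    sapcontrol -nr 00 -function Start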
============================================================================
SAP Note implementation:
SAP Notes are implemented in the development system only. Implementing a note creates
a transport request, which we then import into the other systems.
SNOTE is the transaction to download and implement SAP Notes.
Go to the Note Browser, select the note, and implement it.
Before implementing a note, check its status: it should be "Can be implemented".
==========================================================================
Support Package upgrade using the SUM tool:
We must perform some pre-steps:
RZ70 -> SLD -> LMDB -> Maintenance Planner must be in sync; only then can we download the
stack.xml file from the Maintenance Planner.
Space check: for an SPS upgrade about 40 GB at OS level and 100 GB in the database;
for an EHP upgrade about 100 GB at OS level and 200 GB in the database.
Check the maintenance certificate in transaction SLICENSE.
Take a backup of the database.
Check for open TRs and update requests.
Implement SAP Notes if required.
Upgrade SPAM and SAINT to the latest versions.
Check that the host agent is at least version 7.21.
Create a new folder under /usr/sap/<SID>/SUM, copy the SUM software into it, and extract it with the SAPCAR -xvf command.
Once the SUM archive is extracted, folders such as abap and jvm plus the startup scripts are created.
Once the pre-steps are complete, launch the SUM tool.
The command is:
./STARTUP confighostagent <SID>
SUM starts and gives you the URLs (see the sketch below).
For HTTP the port is 1128.
For HTTPS the port is 1129.
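A minimal sketch of extracting and starting SUM, assuming SUM 1.0 on Linux and a placeholder SID "DEV"; the archive name and the URL path are illustrative and can differ between SUM versions:

    # as <sid>adm: extract the SUM archive under /usr/sap/<SID>
    cd /usr/sap/DEV
    SAPCAR -xvf /tmp/SUM10SP26_3-20006543.SAR     # hypothetical archive name; creates the SUM directory
    cd SUM
    # register SUM with the SAP Host Agent and start it
    ./STARTUP confighostagent DEV
    # the UI is then served through the host agent, for example:
    #   http://<hostname>:1128/lmsl/sumabap/DEV/doc/sluigui
    #   https://<hostname>:1129/lmsl/sumabap/DEV/doc/sluigui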
Copy the URL and open it in a browser. Once the UI is launched, the SUM phases start; these are
Extraction
Configuration
Checks
Preprocessing
Execution
Post processing
In the Extraction phase it asks for the stack.xml file location and the DDIC, database SYSTEM user,
and <sid>adm passwords.
In the Configuration phase it asks for the configuration mode: single system, standard, or advanced.
I selected standard at that time. These modes are used to minimize downtime; you then enter the
number of uptime and downtime processes for SQL, R3trans, R3load, and ABAP.
It asks for the shadow instance SID and instance number.
Select the SGEN option and the batch job server; it also asks whether any SAP Notes need to be implemented.
In the Checks phase, SUM checks whether any space needs to be increased; the space
requirements can be reviewed in the checks.log file.
In the Preprocessing phase, it asks you to lock the development workbench. Once you confirm,
the workbench is locked and users can no longer perform development tasks.
After that, the shadow instance is created and data is copied to it using
R3load processes; during the shadow phase SPDD is prompted for the data dictionary objects.
With the help of a developer, we adjust the data dictionary objects in the shadow instance.
Note: we must log in to the shadow instance with client 000 and the DDIC user, activate one
user, and then log in again with that user.
When the SPDD adjustments are done, a TR is created; that TR can be imported into the other systems.
After that it asks you to take a backup; click Next to move to the Execution phase.
In the Execution phase the downtime part of the upgrade runs: SUM performs the upgrade activities and copies the
upgraded data from the shadow instance to the original instance.
In the Postprocessing phase, it prompts for SPAU (repository objects); this activity is also done together
with the developers and can be completed within 14 days.
It also performs SGEN and cleanup activities.
SUM log files can be checked on the right-hand side of the SUM screen, and
at OS level under /usr/sap/<SID>/SUM/abap/log.
DB Refresh:
Pre-steps: first we log in to the source system (production) and perform 2 activities:
take a backup of the source database and generate the control file trace using the command
alter database backup controlfile to trace;
Then copy the backup and the trace file to the target system (quality or test) and perform the pre-steps on the target
system:
Take a backup of the target system.
Take screenshots of transactions such as STMS, WE20 (partner profiles), SCC4, RZ04, SMLG, and
so on.
Create a transport request and add the RFC tables (RFCDES, ...), the USR02 table, and the BDLS-relevant tables to
that request.
Export the user master data using transaction SCC8.
Check with the functional and ABAP teams which TRs need to be re-imported after the refresh.
Open the trace file copied from the source system: remove the lines above STARTUP NOMOUNT
and below the CHARACTER SET UTF8 line, change the source SID to the target SID, change REUSE to SET
(keeping the RESETLOGS option), and change ARCHIVELOG to NOARCHIVELOG.
Once all the pre-steps are complete, we restore the database with the brrestore command, using the backup,
and use the edited trace file to recreate the control file (a skeleton of the edited file is sketched below).
Post-steps are of 2 types: one set on the DB side and one on the SAP application side.
First we perform the DB post-steps, i.e. recreating the control file:
Rename the edited trace file to control.sql.
Start the database in NOMOUNT state and execute @control.sql; the CREATE CONTROLFILE statement creates the new control files and mounts the database.
Then open the database with RESETLOGS using the command: alter database open resetlogs;
Then drop and regenerate the OPS$ users if required (for older versions).
The database is now open.
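A minimal sketch of these DB-side post steps, assuming an Oracle database and that the edited file was saved as control.sql in the current directory:

    # as the ora<sid> user (or any user with SYSDBA rights)
    sqlplus / as sysdba
    SQL> STARTUP NOMOUNT
    SQL> @control.sql                                -- CREATE CONTROLFILE runs and mounts the database
    SQL> ALTER DATABASE OPEN RESETLOGS;
    SQL> SELECT name, open_mode FROM v$database;     -- verify the new SID and that the DB is open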
Coming to the SAP application side, the post-activities are:
First, set the background work process parameter (rdisp/wp_no_btc) to 0 at OS level, as sketched at the end of this section, then start the SAP application
and suspend all released background jobs by running report BTCTRNS1 in SE38.
Then perform the post-activities, reconfiguring the required settings using the screenshots taken
during the pre-steps.
Change the background work process parameter back to its previous value, restart the system, and resume the
background jobs by running report BTCTRNS2 in SE38.
Then import the TRs created earlier (the user export request and the tables request).
Perform the logical system name conversion using transaction BDLS (it can take 6 to 8 hours).
Re-import the other TRs if required.
Hand the system over to the users.
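A minimal sketch of the background work process change, assuming the parameter is maintained directly in the instance profile at OS level (it can equally be changed via RZ10); the profile path, instance name, and original value are placeholders:

    # edit the instance profile and set the batch work process count to 0 for the refresh window
    vi /usr/sap/QAS/SYS/profile/QAS_DVEBMGS00_qashost
        rdisp/wp_no_btc = 0
    # restart the instance so the change takes effect, then suspend jobs with BTCTRNS1 in SE38
    # after the post steps, set rdisp/wp_no_btc back to its previous value (e.g. 4),
    # restart again, and release the suspended jobs with BTCTRNS2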
If users are not able to log in to the system:
In that situation, I first check from my end whether I can log in. If not, I check whether
SAP is up and running. If the system is up, I use the dpmon command to check the work process status
and see whether all work processes are occupied; if they are, I identify and kill the blocking processes. That is one
possible cause. Another one, if it is an Oracle DB, is that the oraarch directory is full;
if we observe that the oraarch directory is full, we can back up and move the old archive files to another location.
It may also be a GUI or network issue, which we can check as well.
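A minimal sketch of those checks at OS level, assuming a Linux host, an Oracle database, instance number 00, and placeholder SID/host names:

    # check whether the SAP processes are running
    sapcontrol -nr 00 -function GetProcessList
    # inspect the work process table without a GUI logon
    dpmon pf=/usr/sap/PRD/SYS/profile/PRD_DVEBMGS00_prdhost
    # if the Oracle archive destination is full, user logons and updates will hang
    df -h /oracle/PRD/oraarch
    # back up the old archive logs (for example with brarchive -sd) before moving or deleting anything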
If users complain that they are facing a slowness issue:
Whenever I get this type of issue, I immediately check the system response time from my end in
transaction SMLG. If the response time is high, I check whether any long-running programs are
executing, using SM37 (jobs) and SM50 (work processes); here we can already identify some causes, such as which
processes are currently running. Then I check the dialog
response time in transaction ST03. There we can check the response time of a particular user or a particular
transaction and conclude where the time is being spent: database time, CPU
time, roll-in/roll-out time, or RFC time.
We can also check transaction STAD for the statistical records; it gives a detailed analysis of
which programs were executed, with their response times, memory consumption,
and user details.
If the slowness affects a specific transaction or user, we can activate a trace; there we can find
exactly where the time is being spent.
Client creation:
SCC4 is the transaction to create a client. Before creating the client, we should create a logical system
to uniquely identify the client in the landscape; BD54 or SALE is the transaction to create the logical system.
Once the client is created, we can log in with user SAP* and password PASS.
If login with SAP* is not possible, set the profile parameter
login/no_automatic_user_sapstar = 0 (0 enables the automatic SAP* user, 1 disables it).
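A minimal sketch, assuming the parameter is set directly in the instance profile (it can also be maintained via RZ10); the profile path and instance name are placeholders:

    # allow login with the automatic SAP* user (password PASS) in the newly created client
    vi /usr/sap/DEV/SYS/profile/DEV_DVEBMGS00_devhost
        login/no_automatic_user_sapstar = 0
    # restart the instance for the change to take effect; set the parameter back to 1 afterwards for security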
There are 3 types of client copies:
Local client copy: used to copy client data between clients within the same system.
Log in to the target client and go to transaction SCCL, select the client profile, and schedule it as a background job.
SCC3 is the transaction to check all client copy logs.
Remote client copy: used to copy client data between clients of different
systems.
To perform a remote client copy, you must first create an RFC connection between the systems.
Then log in to the target client and go to transaction SCC9, select the RFC destination and the client profile,
and schedule it as a background job.
Client export/import: also used to copy client data between clients of
different systems, but to perform an export/import client copy, TMS must be
configured.
First log in to the source client and go to transaction SCC8, select the client profile, and schedule it as a background job.
3 transport requests are created, with suffixes KO, KT, and KX.
Then log in to the target system/client, go to STMS_IMPORT, select the KT request, and import it.
Then run transaction SCC7 to perform the post-processing steps of the client copy.