SAP Table Connector
This software and documentation are provided only under a separate license agreement containing restrictions on use and disclosure. No part of this document may be
reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica LLC.
U.S. GOVERNMENT RIGHTS Programs, software, databases, and related documentation and technical data delivered to U.S. Government customers are "commercial
computer software" or "commercial technical data" pursuant to the applicable Federal Acquisition Regulation and agency-specific supplemental regulations. As such,
the use, duplication, disclosure, modification, and adaptation is subject to the restrictions and license terms set forth in the applicable Government contract, and, to the
extent applicable by the terms of the Government contract, the additional rights set forth in FAR 52.227-19, Commercial Computer Software License.
Informatica, the Informatica logo, Informatica Cloud, and PowerCenter are trademarks or registered trademarks of Informatica LLC in the United States and many
jurisdictions throughout the world. A current list of Informatica trademarks is available on the web at https://www.informatica.com/trademarks.html. Other company
and product names may be trade names or trademarks of their respective owners.
Portions of this software and/or documentation are subject to copyright held by third parties. Required third party notices are included with the product.
DISCLAIMER: Informatica LLC provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied
warranties of noninfringement, merchantability, or use for a particular purpose. Informatica LLC does not warrant that this software or documentation is error free. The
information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation
is subject to change at any time without notice.
NOTICES
This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software
Corporation ("DataDirect") which are subject to the following terms and conditions:
1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.
2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT INFORMED OF THE POSSIBILITIES
OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT LIMITATION, BREACH OF CONTRACT, BREACH
OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS.
The information in this documentation is subject to change without notice. If you find any problems in this documentation, report them to us at
[email protected].
Informatica products are warranted according to the terms and conditions of the agreements under which they are provided. INFORMATICA PROVIDES THE
INFORMATION IN THIS DOCUMENT "AS IS" WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING WITHOUT ANY WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND ANY WARRANTY OR CONDITION OF NON-INFRINGEMENT.
Table of Contents
Sort options. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Join conditions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
Advanced properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
Delta extraction for SAP table reader mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Delta extraction behavior. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Informatica custom table /INFADI/TBLCHNGN. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Update modes for delta extraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Configuring a parameter file for delta extraction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
Configure a table name override for delta extraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
Rules and guidelines for delta extraction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
Configuring delta extraction for an SAP table reader mapping. . . . . . . . . . . . . . . . . . . . . . 39
Troubleshooting delta extraction for SAP Table Reader mappings. . . . . . . . . . . . . . . . . . . 39
Key range partitioning for SAP Table sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
Configuring key range partitioning for SAP Table sources. . . . . . . . . . . . . . . . . . . . . . . . . 41
Best practices for key range partitioning. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
SAP Table lookups in mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
Configuring a mapping with an SAP Table source. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
Creating a mapping task. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
Mapping with an SAP Table source example. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
Step 1: Define the mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
Step 2: Configure the SAP Table source. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
Step 3: Configure the flat file target. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Step 4: Save the mapping and create a mapping task. . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
Index. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
Preface
Use this guide to learn how to read from or write to SAP by using SAP Table Connector in Cloud Data Integration. Learn how to create an SAP Table Connector connection and how to develop and run synchronization tasks, mappings, mapping tasks, and data transfer tasks in Cloud Data Integration.
Informatica Resources
Informatica provides you with a range of product resources through the Informatica Network and other online
portals. Use the resources to get the most from your Informatica products and solutions and to learn from
other Informatica users and subject matter experts.
Informatica Documentation
Use the Informatica Documentation Portal to explore an extensive library of documentation for current and
recent product releases. To explore the Documentation Portal, visit https://docs.informatica.com.
If you have questions, comments, or ideas about the product documentation, contact the Informatica
Documentation team at [email protected].
To find resources and connect with other users, visit the Informatica Network: https://network.informatica.com/community/informatica-network/products/cloud-integration
Developers can learn more and share tips at the Cloud Developer community:
https://network.informatica.com/community/informatica-network/products/cloud-integration/cloud-developers
Visit the Informatica Marketplace to find solutions that extend and enhance your Informatica products: https://marketplace.informatica.com/
To search the Knowledge Base, visit https://search.informatica.com. If you have questions, comments, or
ideas about the Knowledge Base, contact the Informatica Knowledge Base team at
[email protected].
Subscribe to the Informatica Intelligent Cloud Services Trust Center to receive upgrade, maintenance, and
incident notifications. The Informatica Intelligent Cloud Services Status page displays the production status
of all the Informatica cloud products. All maintenance updates are posted to this page, and during an outage,
it will have the most current information. To ensure you are notified of updates and outages, you can
subscribe to receive updates for a single component or all Informatica Intelligent Cloud Services
components. Subscribing to all components is the best way to be certain you never miss an update.
To subscribe, on the Informatica Intelligent Cloud Services Status page, click SUBSCRIBE TO UPDATES. You
can choose to receive notifications sent as emails, SMS text messages, webhooks, RSS feeds, or any
combination of the four.
To find online support resources on the Informatica Network, click Contact Support in the Informatica
Intelligent Cloud Services Help menu to go to the Cloud Support page. The Cloud Support page includes
system status information and community discussions. Log in to Informatica Network and click Need Help to
find additional resources and to contact Informatica Global Customer Support through email.
The telephone numbers for Informatica Global Customer Support are available from the Informatica web site
at https://www.informatica.com/services-and-training/support-services/contact-us.html.
Chapter 1
Use an SAP Table Connector connection to read data from SAP tables and to write data to SAP custom tables that are created under the customer namespace.
You can read data from transparent tables, cluster tables, pool tables, views, and ABAP CDS views. The Secure Agent accesses data through the application layer in SAP by using ABAP, and data is streamed to the Secure Agent over the HTTP or HTTPS protocol. SAP Table Connector supports joins and filters on the source tables.
You can also use SAP Table Connector to read data from an SAP ADSO object. To write data to an ADSO, use
SAP ADSO Writer Connector.
To optimize performance when the Secure Agent and the SAP system are in different networks, you can
enable data compression when you read data from SAP.
When you create a synchronization task, mapping, mapping task, or data transfer task, Data Integration generates a dynamic ABAP query to read from SAP tables and write to custom SAP tables. You can switch mappings to advanced mode to include transformations and functions that enable advanced functionality.
Note: When you want to write data to SAP standard tables, SAP recommends that you use OData, BAPI/RFC,
or IDoc API to update data within the SAP application. For information about using an SAP Table Connector
connection to write data to SAP custom tables, contact Informatica Global Customer Support.
SAP objects
You can connect to SAP views, ABAP CDS views, transparent tables, pool tables, and cluster tables by using an SAP Table connection.
Data Integration does not differentiate between tables and views. You extract data from views the same way
you extract data from tables. When you select a table, Data Integration displays the table name followed by
the business name in the Select Object dialog box. You can filter by table name or business name when you
connect to the SAP system.
When you import an SAP object, Data Integration imports the following metadata:
• Source name
• Column names
• Business descriptions
• Data types, length, precision, and scale
ABAP CDS views
You can use an ABAP CDS view as a source or lookup in mappings.
Note: ABAP CDS views are supported from SAP NetWeaver system version 7.50 SP4 or later.
When you import an ABAP CDS view into Cloud Data Integration, the agent adds a prefix to the parameter name to indicate the parameter type.
• Mandatory Parameter. A parameter for which you need to specify a value. For example, in the field
paramM_P_FISCALYEAR, paramM is the prefix for a mandatory parameter that the agent adds. P_FISCALYEAR
is the parameter name that is a part of ABAP CDS views.
• Optional Parameter. When you define a parameter in SAP and use the annotation
@Environment.systemField, the parameter appears as an optional parameter in the list of fields.
If you do not provide a value, the optional parameter uses the ABAP system field values.
For example, in the field paramO_P, paramO is the prefix for an optional parameter that the agent adds.
P_TOFISCALPER is the parameter name that is a part of ABAP CDS views.
For example, when you select a CDS view, paramM denotes a mandatory parameter and paramO denotes an optional parameter in the list of fields.
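The prefix convention can be sketched in a few lines. The helper below is illustrative only (it is not part of the product) and shows how an imported field name splits into a parameter type and the original parameter name:

```python
def parse_cds_parameter(field_name):
    """Split an imported CDS view field name into (parameter_type, parameter_name).

    The agent prefixes mandatory parameters with paramM_ and optional
    parameters with paramO_.
    """
    prefixes = {"paramM_": "mandatory", "paramO_": "optional"}
    for prefix, kind in prefixes.items():
        if field_name.startswith(prefix):
            return kind, field_name[len(prefix):]
    return "regular", field_name  # an ordinary field, not a parameter

print(parse_cds_parameter("paramM_P_FISCALYEAR"))   # ('mandatory', 'P_FISCALYEAR')
print(parse_cds_parameter("paramO_P_TOFISCALPER"))  # ('optional', 'P_TOFISCALPER')
```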
• When you configure an SAP source, configure row limits using the advanced source properties available
on the scheduling page of the task wizard. Row limits on the data filters page of the task wizard are not
enabled for SAP sources.
• Do not use tables as SAP Table sources if the sources have circular primary key-foreign key relationships.
• When you use more than one SAP table in a synchronization task, you can use one cluster table or one
pool table. If you use more than one cluster or pool table, errors occur at run time. You can use the Object
Search dialog box to see if a table is a cluster table or a pool table. You can use multiple transparent
tables in a task.
• When you join a cluster table or pool table with a transparent table, include all key fields in the transparent
table in the join condition. List the fields in the order that they appear in the SAP system.
• When you join a cluster table or pool table with a transparent table, use all of the source fields in the
transparent table that you use in the joins and filters in the field mapping. Also, map at least one field
from the cluster or pool table.
• Define relationships for multiple sources after the data preview displays the data. You can use the wizard
in advanced mode to avoid waiting to preview data.
• Data sorting does not apply to cluster or pool table fields.
- Configure the SAP parameter rdisp/max_wprun_time to allow more time for the read. For more
information, see the SAP documentation.
- To increase the number of records that the Secure Agent can retrieve at one time, you can increase the Java heap memory for the Secure Agent. To do this, edit the Secure Agent. In the System Configuration Details section, select DTM and set the JVMOption1 property to the following value: -Xmx512m. Click OK to save the change and restart the Secure Agent. Adjust the value for the JVMOption1 property based on the number of records you want to retrieve and the available memory on the Secure Agent machine.
• For a lookup on an SAP object, configure the lookup to return less than 20 rows. Tasks might fail if the
lookup returns more than 20 rows.
• A lookup on an SAP object does not return matching rows if the lookup comparison value is NULL.
• When you define a reject file name for an SAP target, use the default name or the variable $ErrorFileName.
The $ErrorFileName variable uses the following convention for the reject file name:
s_dss_<task name>_<run number>_error.csv.bad
• When you define a reject directory for an SAP target, use the variable $PMBadFileDir. When you use the
$PMBadFileDir variable, the synchronization task writes the reject file to the following Secure Agent
directory:
<SecureAgent_InstallDir>/main/rdtmDir/error
• Consider the following rules when you configure a mapping to read from CDS views:
- When you select a CDS view as an SAP source object in a mapping, you cannot preview the data.
- When you configure a mapping with a Lookup transformation to look up records in CDS views with parameters, only uncached lookups are supported.
- Delta extraction does not apply if you select a CDS view as an SAP source object.
- Do not use completely parameterized or advanced filters for CDS view parameters.
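The reject file naming convention above can be made concrete with a short sketch. The helper below simply applies the documented s_dss_&lt;task name&gt;_&lt;run number&gt;_error.csv.bad pattern; the task name and run number are hypothetical:

```python
def resolve_error_file_name(task_name, run_number):
    """Resolve the $ErrorFileName variable using the documented convention:
    s_dss_<task name>_<run number>_error.csv.bad
    """
    return f"s_dss_{task_name}_{run_number}_error.csv.bad"

print(resolve_error_file_name("LoadCustomers", 3))
# s_dss_LoadCustomers_3_error.csv.bad
```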
SAP Table Connector assets
Create assets in Data Integration to integrate data using SAP Table Connector.
When you use SAP Table Connector, you can include the following Data Integration assets:
• Mappings
• Mapping tasks
• Synchronization tasks
• Data transfer tasks
For more information about configuring assets and transformations, see Mappings, Transformations, and
Tasks in the Data Integration documentation.
You can use the SAP Table connection type to read data from transparent tables, cluster tables, pool tables,
or views. You can also use the SAP Table connection type to write data to custom transparent tables.
To process data through SAP, you also need to verify that the required licenses are enabled for the SAP system.
The following table lists the libraries corresponding to the different operating systems:
Linux 64 - libicudata.so.50
- libicui18n.so.50
- libicuuc.so.50
- libsapnwrfc.so
- libsapucum.so
Windows 64 - icudt50.dll
- icuin50.dll
- icuuc50.dll
- libsapucum.dll
- sapnwrfc.dll
Note: Verify that you download the most recent version of the libraries.
3. Copy the SAP NetWeaver RFC SDK 7.50 libraries to the following directory:
<Informatica Secure Agent installation directory>\apps\Data_Integration_Server\ext\deploy_to_main\bin\rdtm
Create the deploy_to_main\bin\rdtm directory if it does not already exist.
Note: If you upgrade from a 32-bit operating system, the Secure Agent copies the 32-bit SAP NetWeaver
RFC SDK 7.50 libraries to the directory. You need to replace the 32-bit libraries with 64-bit libraries. If you
upgrade from a 64-bit operating system, you do not need to perform this step. The Secure Agent copies
the 64-bit SAP NetWeaver RFC SDK 7.50 libraries to the directory.
4. Set the following permissions for each NetWeaver RFC SDK library:
• Read, write, and execute permissions for the current user.
• Read and execute permissions for all other users.
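If you script the permission change, the modes described in step 4 correspond to 0755. The sketch below is one way to automate it under that assumption; it is not an official utility:

```python
import os
import stat

def set_rfc_sdk_permissions(library_path):
    """Set read, write, and execute for the owner, and read and
    execute for all other users (mode 0755)."""
    mode = (stat.S_IRWXU                    # rwx for the current user
            | stat.S_IRGRP | stat.S_IXGRP   # r-x for the group
            | stat.S_IROTH | stat.S_IXOTH)  # r-x for other users
    os.chmod(library_path, mode)
```

For example, you could loop over every library file in the rdtm directory and call set_rfc_sdk_permissions on each one.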
5. From the SAP Support Portal, download the 64-bit SAP JCo libraries based on the operating system on
which the Secure Agent runs:
Windows sapjco3.jar
sapjco3.dll
Linux sapjco3.jar
libsapjco3.so
Note: Verify that you download the most recent version of the libraries.
6. Copy the JCo libraries to the following directory:
<Informatica Secure Agent installation directory>\apps\Data_Integration_Server\ext\deploy_to_main\bin\rdtm-extra\tpl\sap
Create the deploy_to_main\bin\rdtm-extra\tpl\sap directory if it does not already exist.
Note: If you upgrade from a 32-bit operating system, the Secure Agent copies the 32-bit SAP JCo
libraries to the directory. You need to replace the 32-bit JCo libraries with 64-bit JCo libraries. If you
upgrade from a 64-bit operating system, you do not need to perform this step. The Secure Agent copies
the 64-bit SAP JCo libraries to the directory.
Windows: ../bin/rdtm-extra/tpl/sap/sapjco3.jar;../bin/rdtm/javalib/sap/sap-adapter-common.jar
Linux: ../bin/rdtm-extra/tpl/sap/sapjco3.jar:../bin/rdtm/javalib/sap/sap-adapter-common.jar
Note: If you copy the value directly from the table, the hyphens (-) in the value are incorrectly copied.
Copy the value to a text editor and make sure that the value you copied is not corrupted.
g. Click Save.
h. Repeat steps 2 through 7 on every machine where you installed the Secure Agent.
8. Restart the Secure Agent.
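The two classpath values in the table above differ only in the separator: a semicolon on Windows and a colon on Linux. If you generate the value instead of copying it (which also avoids the corrupted-hyphen problem noted above), a sketch like this works; the entry list is taken from the table:

```python
import os

JCO_ENTRIES = [
    "../bin/rdtm-extra/tpl/sap/sapjco3.jar",
    "../bin/rdtm/javalib/sap/sap-adapter-common.jar",
]

def build_classpath(entries, separator=os.pathsep):
    """Join classpath entries with the OS-specific separator:
    ';' on Windows and ':' on Linux."""
    return separator.join(entries)

print(build_classpath(JCO_ENTRIES, ":"))  # the Linux form of the value
```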
The following table describes the required authorization to read from SAP tables:
S_TABU_DIS / S_TABU_NUM: Provide the SAP table name from which you want to read data.
The following table describes the required authorization to write to SAP tables:
S_TABU_DIS / S_TABU_NUM: Provide the SAP table name to which you want to write data.
• The transport files are applicable for SAP version ECC 5.0 or later.
• Verify that the transport files you installed on the SAP machines are the latest.
• Verify that the RSODPABAPCDSVIEW table is available in SAP before you install the
TABLE_READER_Addon transport files. If the RSODPABAPCDSVIEW table is not available, the
TABLE_READER_Addon transport installation fails.
• Before you install the transports on your production system, install and test the transports in a
development system.
The following table lists the transports that you need to install based on the SAP source type that you want to
access:
TABLE_READER_R900013.ER6 and TABLE_READER_K900013.ER6 (transport request ER6K900013): To read data from SAP transparent tables, cluster tables, and pool tables, install only the TABLE_READER transport.
TABLE_READER_Addon_R900085.S4N and TABLE_READER_Addon_K900085.S4N (transport request S4NK900085): To read data from ABAP CDS views, install both the TABLE_READER and TABLE_READER_Addon transports. Use the TABLE_READER_Addon transports for SAP NetWeaver version 7.50 SP4 and later. Whenever you install the TABLE_READER transport, you need to reinstall the TABLE_READER_Addon transport even if there is no change in the TABLE_READER_Addon transport version.
Note: Ensure that you first install the TABLE_READER transport and only then install the TABLE_READER_Addon transport.
1. Find the transport files in the following directory on the Secure Agent machine:
<Informatica Secure Agent installation directory>\downloads\package-SAPConnector.<Latest version>\package\rdtm\sap-transport\SAPTableReader
2. Copy the cofile transport file to the Cofile directory in the SAP transport management directory on each
SAP machine that you want to access.
The cofile transport file uses the following naming convention: TABLE_READER_K<number>.ER6.
3. Remove "TABLE_READER_" from the file name to rename the cofile.
For example, for a cofile transport file named TABLE_READER_K900013.ER6, rename the file to
K900013.ER6.
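Steps 2 and 3 can be combined in a small script. The sketch below is illustrative only; the directory paths are placeholders that you must adapt to your SAP transport management directory:

```python
import shutil
from pathlib import Path

def install_cofile(cofile, cofile_dir):
    """Copy a cofile such as TABLE_READER_K900013.ER6 into the SAP
    cofile directory and strip the TABLE_READER_ prefix from the name."""
    source = Path(cofile)
    target = Path(cofile_dir) / source.name.removeprefix("TABLE_READER_")
    shutil.copyfile(source, target)
    return target.name

# TABLE_READER_K900013.ER6 is installed as K900013.ER6
```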
To get and install the latest SAP Table Writer transport files, contact Informatica Global Customer Support.
Check out Prepare for configuration to learn more about the configuration prerequisites.
Connection details
The following table describes the basic connection properties:
Property Description
Runtime Environment: The name of the runtime environment where you want to run tasks. Select a Secure Agent or serverless runtime environment.
Username: The user name with the appropriate user authorization to connect to the SAP account.
Application Server: The host name or IP address of the SAP application server.
If you enter the host name or IP address of the SAP application server in this field, you do not
need to enter the directory of the sapnwrfc.ini file in the Saprfc.ini Path field and the DEST entry
in Destination field.
Note: This property doesn't apply if you create the connection to write to SAP tables.
Advanced settings
The following table describes the advanced connection properties:
Property Description
Destination: DEST entry that you specified in the sapnwrfc.ini file for the SAP application server.
Use all uppercase letters for the destination.
This property is required if you create the connection to write to SAP tables.
If you enter the DEST entry in this field, you do not need to enter the host name or IP address and the system number of the SAP application server in the Application Server and System Number fields.
Port Range: HTTP port range. The SAP Table connection uses the specified port numbers to connect to SAP tables using the HTTP protocol. Default range is 10000-65535.
Enter a range within the default range, for example, 10000-20000. If the specified range is outside the default range, the connection uses the default range.
Test Streaming: Tests the connection. When selected, tests the connection using both the RFC and HTTP protocols. When not selected, tests the connection using only the RFC protocol.
Https Connection: When selected, connects to SAP through the HTTPS protocol. To successfully connect to SAP through HTTPS, verify that an administrator has configured the machines that host the Secure Agent and the SAP system.
Keystore Location: Absolute path and file name of the keystore file to connect to SAP. Specify both the directory and file name in the following format: <Directory>/<Keystore file name>.jks
SAP Additional Properties: Additional SAP properties that the Secure Agent uses to connect to SAP, such as load balancing parameters.
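The Port Range fallback rule described above can be expressed as a short sketch. The function is purely illustrative of the documented behavior, not connector code:

```python
DEFAULT_RANGE = (10000, 65535)

def effective_port_range(low, high):
    """Return the configured range if it lies within the default range
    of 10000-65535; otherwise fall back to the default range, as the
    SAP Table connection does."""
    default_low, default_high = DEFAULT_RANGE
    if default_low <= low <= high <= default_high:
        return (low, high)
    return DEFAULT_RANGE

print(effective_port_range(10000, 20000))  # (10000, 20000)
print(effective_port_range(8000, 20000))   # (10000, 65535), outside the default range
```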
SAP uses the communications protocol, Remote Function Call (RFC), to communicate with other systems.
SAP stores RFC-specific parameters and connection information in a file named sapnwrfc.ini.
When you read data from SAP tables, if you define the path and file name of the sapnwrfc.ini file in the SAP
connection, the Secure Agent uses the sapnwrfc.ini file. However, if you define only the path of the
sapnwrfc.ini file in the connection, the Secure Agent first verifies if an sapnwrfc.ini file exists in the
specified path. If the sapnwrfc.ini file exists, the Secure Agent uses it. Otherwise, an exception occurs.
To write data to SAP tables, you cannot use the sapnwrfc.ini file.
Use a DOS editor or WordPad to configure the sapnwrfc.ini file. Notepad can introduce errors to the
sapnwrfc.ini file.
After you create the sapnwrfc.ini file, copy the file to the following directory and restart the Secure Agent:
Note: If you are upgrading from an earlier version, you do not need to perform this step. The Secure Agent
copies the sapnwrfc.ini file to the directory.
Connection to a specific SAP application server
Create this connection to enable communication between an RFC client and an SAP system. Each connection entry specifies one application server and one SAP system.
The following sample shows a connection entry for a specific SAP application server in the
sapnwrfc.ini file:
DEST=sapr3
ASHOST=sapr3
SYSNR=00
Connection to use SAP load balancing
Create this connection to enable SAP to create an RFC connection to the application server with the least
load at run time. Use this connection when you want to use SAP load balancing.
The following sample shows a connection entry for SAP load balancing in the sapnwrfc.ini file:
DEST=sapr3
R3NAME=ABV
MSHOST=infamessageserver.informatica.com
GROUP=INFADEV
Connection to an RFC server program registered at an SAP gateway
Create this connection to connect to an SAP system from which you want to receive outbound IDocs.
The following sample shows a connection entry for an RFC server program registered at an SAP gateway
in the sapnwrfc.ini file:
DEST=sapr346CLSQA
PROGRAM_ID=PID_LSRECEIVE
GWHOST=sapr346c
GWSERV=sapgw00
DEST: Logical name of the SAP system for the connection. All DEST entries must be unique. You need to have only one DEST entry for each SAP system. For SAP versions 4.6C and later, use up to 32 characters. For earlier versions, use up to eight characters. Use this parameter for connections to a specific SAP application server, connections that use load balancing, and connections to an RFC server program registered at an SAP gateway.
ASHOST: Host name or IP address of the SAP application server. The Secure Agent uses this entry to attach to the application server. Use this parameter to create a connection to a specific SAP application server.
R3NAME: Name of the SAP system. Use this parameter to create a connection that uses SAP load balancing.
MSHOST: Host name of the SAP message server. Use this parameter to create a connection that uses SAP load balancing.
GROUP: Group name of the SAP application server. Use this parameter to create a connection that uses SAP load balancing.
PROGRAM_ID: Program ID. The Program ID must be the same as the Program ID for the logical system that you define in the SAP system to send or receive IDocs. Use this parameter to create a connection to an RFC server program registered at an SAP gateway.
GWHOST: Host name of the SAP gateway. Use this parameter to create a connection to an RFC server program registered at an SAP gateway.
GWSERV: Server name of the SAP gateway. Use this parameter to create a connection to an RFC server program registered at an SAP gateway.
TRACE: Debugs RFC connection-related problems. Set 0 (off), 1 (brief), 2 (verbose), or 3 (full), based on the level of detail that you want in the trace. Use this parameter for all three connection types.
1. At the command prompt, set the OPENSSL_CONF variable to the absolute path to the openssl.cfg file.
For example, enter the following command: set OPENSSL_CONF=C:\OpenSSL-Win64\bin\openssl.cfg
2. Navigate to the <openSSL installation directory>\bin directory.
3. To generate a 2048-bit RSA private key, enter the following command:
openssl.exe req -new -newkey rsa:2048 -sha1 -keyout <RSAkey File_Name>.key -out <RSAkey File_Name>.csr
4. When prompted, enter the following values:
• Private key password (PEM pass phrase). Enter a phrase that you want to use to encrypt the secret
key. Re-enter the password for verification.
Important: Make a note of this PEM password. You need to specify this value in some of the following
steps.
• Two-letter code for the country name.
• State or province name.
• Locality name.
• Organization name.
• Organizational unit name.
1. To configure the JVMOption property in Administrator to define the Secure Agent as a host that you can
add in the HTTP_Whitelist table of the SAP system, perform the following steps:
a. Select Administrator > Runtime Environments.
b. On the Runtime Environments page, select the Secure Agent machine that runs the mapping.
c. Click Edit.
d. In the System Configuration Details section, from the Service list, select Data Integration Server.
e. Edit any JVMOption field to add the following value:
-Dsap_whitelist_check=1
f. Click Save.
g. Repeat steps b through f for every Secure Agent that you want to define as a host in SAP.
Note: If you set the -Dsap_whitelist_check=1 value on the JVMOption property, you need to create
the entry for the Secure Agent in the HTTP_Whitelist table. If you do not create the entry, mappings
and tasks that run on SAP fail.
2. To create an entry for the Secure Agent in the SAP HTTP_Whitelist table using the transaction SE16,
perform the following steps in the SAP system:
a. Go to transaction SE16.
b. Configure properties to define the Secure Agent as a host in SAP.
The following table describes the properties that you need to configure:
Property Description
3. Repeat steps 1 and 2 for every Secure Agent that you want to configure as a whitelisted host in SAP.
You cannot create an SNC connection when you use the serverless runtime environment.
Before you use the serverless runtime environment for an SAP Table Connector connection, you need to
perform the prerequisite tasks.
1. Create the following structure for the serverless agent configuration in AWS: <Supplementary file
location>/serverless_agent_config
2. Add the libraries in the Amazon S3 bucket in the following location in your AWS account:
<Supplementary file location>/serverless_agent_config/sap
3. Copy the following code snippet to a text editor:
version: 1
agent:
dataIntegrationServer:
autoDeploy:
sap:
jcos:
- fileCopy:
sourcePath: sap/jco/<sapjco_library_filename>
- fileCopy:
sourcePath: sap/jco/<sapjco_library_filename>
nwrfcs:
- fileCopy:
sourcePath: sap/nwrfc/<rfc_library_filename>
- fileCopy:
sourcePath: sap/nwrfc/<sapnwrfc_filename>
The following error displays when I test an SAP Table connection or use the connection in a task:
Test Connection Failed for <connection name>. Error getting the version of the native layer: java.lang.UnsatisfiedLinkError: no sapjco3 in java.library.path.
Verify that the location of the sapjco3.dll file is included in the PATH variable on the Secure Agent machine. If you add the location to the PATH variable, restart the Secure Agent.
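One way to confirm the prerequisite is to scan the PATH entries for the JCo library. The check below is a diagnostic sketch under the assumption that the library file is named sapjco3.dll on Windows (use libsapjco3.so on Linux):

```python
import os

def find_jco_library(path_env, library_name="sapjco3.dll"):
    """Return the first PATH entry that contains the JCo library,
    or None if no entry contains it."""
    for entry in path_env.split(os.pathsep):
        if entry and os.path.isfile(os.path.join(entry, library_name)):
            return entry
    return None

# Example: find_jco_library(os.environ.get("PATH", ""))
# If it returns None, add the JCo directory to PATH and restart the agent.
```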
A task that reads from SAP tables fails with the following error:
Error occurred processing data from SAP : Unable to establish Http Communication between
SAP server and agent! Shutting down reader.
The HTTP port is not open or the incoming request is being blocked by Windows Firewall. To resolve the
issue, in Windows Firewall, use the advanced settings to create a new incoming rule. Apply the rule to TCP
and all ports, and choose the HTTP-In protocol.
In advanced mode, the Mapping Designer updates the mapping canvas to include transformations and
functions that enable advanced functionality.
When you configure a mapping to describe the flow of data from source and target, you can also add
transformations to transform data. A transformation includes field rules to define incoming fields. Links
visually represent how data moves through the data flow.
After you create a mapping, you can run the mapping or you can deploy the mapping in a mapping task. The
Mapping Configuration application allows you to process data based on the data flow logic defined in a
mapping or integration template.
Use the Mapping task wizard to create a mapping task. When you create a mapping task, you select the
mapping or integration template for the task to use.
If you configured parameters, which are placeholders for information, in a mapping, you can define the
parameters in the mapping task. Defining parameters provides additional flexibility and allows you to use the
same mapping in multiple mapping tasks. For example, you can use a parameter for a source connection in a
mapping, and then define the source connection when you configure the mapping task.
When you create a mapping task, you can associate the task with a schedule to run it at specified times or on
regular intervals. Or, you can run it manually. You can also configure advanced session properties. You can
monitor tasks that are currently running and view details about completed tasks.
Specify the name and description of the SAP table source. Configure the source and advanced properties for
the source object.
The following table describes the source properties that you can configure in a Source transformation:
Property Description
The following table describes the SAP Table advanced source properties:

Number of rows to be fetched
  The number of rows that are randomly retrieved from the SAP table. The default value of zero retrieves all the rows in the table.

Data Extraction Mode
  The mode that the Secure Agent uses to read data from an SAP table:
  - Normal Mode. Use this mode to read small volumes of data from the SAP table.
  - Bulk Mode. Use this mode to read large volumes of data from the SAP table. Bulk mode provides better performance but consumes more resources than normal mode. You might need to tune the packet size according to the available resources and data set to increase performance.
  Default is normal mode.

Update Mode
  When you read data from SAP tables, you can configure a mapping to perform delta extraction. You can use one of the following options based on the update mode that you want to use:
  - 0 - Full. Use this option when you want to extract all the records from an SAP table instead of reading only the changed data.
  - 1 - Delta initialization without transfer. Use this option when you do not want to extract any data but want to record the latest change number in the Informatica custom table /INFADI/TBLCHNGN for subsequent delta extractions.
  - 2 - Delta initialization with transfer. Use this option when you want to extract all the records from an SAP table to build an initial set of the data and subsequently run a delta update session to capture the changed data.
  - 3 - Delta update. Use this option when you want to read only the data that changed since the last data extraction.
  - 4 - Delta repeat. Use this option if you encountered errors in a previous delta update and want to repeat the delta update.
  - Parameter. When you use this option, the Secure Agent uses the update mode value from a parameter file.
  Default is 0 - Full.

Parameter Name for Update Mode
  The parameter name that you defined for the update mode in the parameter file.

Override Table Name for Delta Extraction
  Overrides the SAP table name with the SAP structure name from which you want to extract delta records that are captured with the structure name in the CDPOS table.

Advanced Properties
  Advanced properties for the SAP Table object to run mappings. If you specify more than one property, separate each property-value pair with a semicolon in the following format: <Property name1>=<Property value1>;<Property name2>=<Property value2>
  For more information about the advanced properties, see “Advanced properties” on page 33.

Tracing Level
  Sets the amount of detail that appears in the log file. Select one of the following tracing level options from the list:
  - Terse
  - Normal
  - Verbose Initialization
  - Verbose Data
  Default is Normal.
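The semicolon-separated format described for the Advanced Properties field can be sketched with a small parser. The parser itself is a hypothetical illustration and is not part of the product; the property names delta_offset and fetch_del_rows are the ones documented in the “Advanced properties” section.

```python
def parse_advanced_properties(value: str) -> dict:
    """Split a semicolon-separated string of the form
    <name1>=<value1>;<name2>=<value2> into a dictionary.
    Illustrative sketch only, not the agent's actual parser."""
    pairs = {}
    for item in value.split(";"):
        item = item.strip()
        if not item:
            continue
        name, _, val = item.partition("=")
        pairs[name.strip()] = val.strip()
    return pairs

# Example string using two properties documented for delta extraction.
props = parse_advanced_properties("delta_offset=480;fetch_del_rows=true")
print(props)  # {'delta_offset': '480', 'fetch_del_rows': 'true'}
```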
Filter options
You can configure the Source transformation to filter data before the data enters the data flow. Use the
source query options to filter source data.
To filter data, configure the source query options on the Source tab of the Source transformation. Expand the
Query Options section, and configure the filter condition.
When you configure a filter, you can use either a simple or advanced filter. You can also use a parameter in a
filter expression and define the filter expression in the task.
To use a simple data filter, select a source object, source field, operator, and then enter the value.
You can also use a parameter in a filter expression in a simple data filter.
The following image shows the configured simple data filter that filters data from the BEDAT field in the
EKKO table when the date is less than or equal to 2016-01-29:
To use the $LastRunTime variable in a simple data filter, select a source object, source field, operator,
and then enter the value.
For example, to filter data from the CPUTM field in the BKPF table when the data is less than or equal to
the LastRunTime variable, use the following format: BKPF CPUTM <= $LastRunTime
The following image shows the configured simple data filter that filters data from the CPUTM field in the
BKPF table when the data is less than or equal to the LastRunTime variable:
To use a simple data filter in an ABAP CDS view object, select an ABAP CDS view source object, source
field, operator, and then enter the value.
For example, to filter data from the paramO_P2 field in the ZSAN_CDS_OPT_PARAM ABAP CDS view
object when the value is not equal to 10, use the following format:
You can also use a parameter in a filter expression in a simple data filter to filter data from an ABAP CDS
view object.
In the example, paramO denotes an optional parameter and paramM denotes a mandatory parameter.
To use an advanced data filter that contains a single condition, select Advanced as the type of filter, and
then enter the field expression in the following format:
For example, to filter data from the BUKRS field in the EKKO table when the value is 1010, use the
following format: ( EKKO.BUKRS = '1010' )
You can also use a parameter in a filter expression in an advanced data filter.
The following image shows the configured advanced data filter that filters data of the BUKRS field from
the EKKO table when the data matches with the defined parameter:
To use an advanced data filter that contains multiple conditions, select Advanced as the type of filter,
and then enter the field expression in the following format:
For example, to filter data from multiple fields in the EKKO table, use the following format that contains a
logical expression: ( EKKO.BUKRS = '1010' AND EKKO.LPONR < '60' AND EKKO.ERNAM <>
'PURCHASER' AND EKKO.BEDAT <= '20160129' ) OR ( EKKO.BUKRS = '1110' )
You can also use a parameter in a filter expression in an advanced data filter.
For example, ( EKKO.BUKRS = $$PARAM AND EKKO.LPONR < $$PARAM1 AND EKKO.ERNAM <>
'PURCHASER' AND EKKO.BEDAT <= $$PARAM2 ) OR ( EKKO.BUKRS = $$PARAM3 ).
To use the $LastRunTime variable in an advanced data filter, select Advanced as the type of filter, and
then enter the field expression in the following format:
For example, to filter data from the BEDAT field in the EKKO table when the data is less than the
LastRunTime variable, use the following format: ( EKKO.BEDAT < $LastRunTime )
The following image shows the configured advanced data filter that filters data from the BEDAT field in
the EKKO table when the data is less than the LastRunTime variable:
To use the SY-DATUM variable in an advanced data filter, select Advanced as the type of filter, and then
enter the field expression in the following format based on the transports you installed:
For example, to filter data from the BEDAT field in the EKKO table when the data is two days older than
the current date and you installed the TABLE_READER transport, use the following format: ( EKKO.BEDAT
= SY-DATUM - 2 )
You can also use the SY-DATUM variable in an advanced data filter when you installed the
TABLE_READER_Addon transport.
Sort options
You can configure the Source transformation to sort data before the data enters the data flow. Use the
source query options to sort source data.
To sort data, configure the source query options on the Source tab of the Source transformation. Expand the
Query Options section, and configure the sort condition.
When you configure a sort condition, you can sort data either in the ascending or descending order from the
field in a table.
To use the sort condition, select the source object, sort by field, and then the sort direction.
For example, to sort data in the ascending order from the EBELN field in the EKKO table, use the following
format: EKKO EBELN Ascending
The following image shows the configured sort condition that sorts data of the EBELN field from the EKKO
table in ascending order:
When you select multiple source objects as the source type, you can configure a relationship to join multiple
source objects.
To configure a relationship using the join condition between the source and the related objects, specify the
key field in the source SAP object, the type of join, the join operator, the related SAP object and the key field
in the related object, and then click ADD.
For example, to configure an inner join to join the EKKO and EKPO tables when the value of the EBELN field of
the EKPO table is less than the value of the EBELN field of the EKKO table, use the following format:
EKKO.EBELN Inner Join < EKPO.EBELN
The following image shows a configured custom relationship that uses an inner join to join the EKKO and
EKPO tables when the value of the EBELN field of the EKPO table is less than the value of the EBELN field of
the EKKO table:
Advanced properties
You can configure additional options in the Advanced Properties field in the Source transformation.
You can use the following properties as advanced properties when you configure delta extraction on SAP
source tables:
• When the external application time zone differs from the SAP system time zone, enter the SAP system
offset time in minutes in the following format:
delta_offset=<SAP system offset time in minutes>
For example, if the difference between the external application time zone and SAP system time zone is
480 minutes, enter the following value:
delta_offset=480
• To fetch the changed data of key fields that are marked for hard deletion in SAP to the target table, enter
the following advanced property:
fetch_del_rows=true
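If both time zones are known, one way to compute a value for delta_offset is to compare UTC offsets at a fixed instant. This is a sketch under stated assumptions: the zone names are placeholders, and which zone is subtracted from which follows the 480-minute example above rather than a documented rule.

```python
from datetime import datetime
from typing import Optional
from zoneinfo import ZoneInfo

def offset_minutes(app_tz: str, sap_tz: str,
                   when: Optional[datetime] = None) -> int:
    """Difference in minutes between the external application time
    zone and the SAP system time zone, as a candidate delta_offset.
    Sketch only; direction of subtraction is an assumption."""
    # Fixed instant so the example is not affected by the current DST state.
    when = when or datetime(2024, 1, 15, 12, 0)
    app = when.replace(tzinfo=ZoneInfo(app_tz)).utcoffset()
    sap = when.replace(tzinfo=ZoneInfo(sap_tz)).utcoffset()
    return int((app - sap).total_seconds() // 60)

# A UTC application against a US Pacific SAP system in winter differs
# by 480 minutes, matching the example in the text.
print(offset_minutes("UTC", "America/Los_Angeles"))  # 480
```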
The SAP table and SAP columns for which you want to perform delta extraction must be part of a change
document object in SAP. For more information about creating a change document in SAP, see the SAP
documentation.
The Secure Agent uses the CDHDR (Change Document Header) and CDPOS (Change Document Position)
tables in SAP to extract the changed data. The CDHDR table stores the change document header information.
The CDPOS table stores the new value and the old value of the changed data. The Secure Agent uses the
change document number in the CDHDR table to find the latest change number in the CDPOS table.
When you perform a delta extraction and if the row is available in the SAP target table, the Secure Agent
updates the delta records to the target table. If the row is not available, the Secure Agent inserts the records
to the SAP target table.
If multiple transactions such as insert, update, or delete occur for the same record in the SAP source, the
Secure Agent fetches only one record. For example, if an insert occurs for a record in the SAP source table
and you run the mapping, the Secure Agent fetches the inserted record. If an update later occurs for the
same record in the SAP source table and you run the delta extraction mapping again, the Secure Agent
fetches the updated record.
When you perform a delta extraction for the deleted records in the SAP table, the Secure Agent fetches only
the key value of the deleted delta records to the target table.
Specifies the SAP source table name from which data is extracted.
SESSION_ID
Specifies the unique Informatica session ID for delta extraction. The Secure Agent generates a unique
session ID for each mapping run for a particular SAP table.
PREV_CHNG_NUM
CURR_CHNG_NUM
PREV_DAT
Indicates the date from when changes were extracted in the last delta extraction. The Secure Agent uses
this information when you use the Delta Repeat option.
PREV_TIME
Indicates the time from when the changes were extracted in the last delta extraction. The Secure Agent
uses this information when you use the Delta Repeat option.
LAST_UPDATED_DAT
Indicates the date up to when the changes were extracted from the source in the last delta extraction.
This value also indicates the date from when the changes will be extracted for the subsequent delta
extraction.
LAST_UPDATED_TIM
Indicates the time up to when the changes were extracted from the source in the last delta extraction.
This value also indicates the time from when the changes will be extracted for the subsequent delta
extraction.
Use this option when you want to extract all the records from an SAP table instead of reading only the
changed data.
Default is Full.
Use this option when you do not want to extract any data but you want to record the LAST_UPDATED_DAT
and LAST_UPDATED_TIM in the Informatica custom table /INFADI/TBLCHNGN for subsequent delta
extractions.
For example, you have a table named Customers that contains 5 million records. You want to read the initial
set of records by using another product such as Informatica Data Replication and then write the records to a
Teradata data warehouse. You then want to use SAP Connector to read only the new customer records that
get added to the table. In this case, you can configure delta initialization without transfer and then
subsequently run a delta update session to capture the changed data.
Use this option when you want to extract all the records from an SAP table to build an initial set of the data
and subsequently run a delta update session to capture the changed data.
Delta update
If you select the Delta update option, the Secure Agent extracts the changed data since the last data
extraction.
• Extracts records from the date and time in the columns LAST_UPDATED_DAT and LAST_UPDATED_TIM
up to the current date and time.
• Moves the values from columns LAST_UPDATED_DAT and LAST_UPDATED_TIM to PREV_DAT and
PREV_TIM, respectively.
• Updates the values in columns LAST_UPDATED_DAT and LAST_UPDATED_TIM to the current date and
time.
Use this option when you want to read only the data that changed since the last data extraction.
Note: To avoid data loss, the current date and time is frozen before the Secure Agent runs the query. If any
data enters at the time when the mapping runs, that data is extracted only when you run the next mapping.
Delta repeat
If you select the Delta repeat option, the Secure Agent repeats a previous delta update in case of errors. It
returns records from the PREV_DAT and PREV_TIM values in the Informatica custom table /INFADI/
TBLCHNGN to the LAST_UPDATED_DAT and LAST_UPDATED_TIM values in the Informatica custom table /
INFADI/TBLCHNGN. It does not update any change numbers in the /INFADI/TBLCHNGN table.
Use this option if you encountered errors in a previous delta update and want to repeat the delta update.
Before you run a delta repeat session, you need to perform a delta update.
Parameter
If you select the Parameter option, the Secure Agent uses the update mode value from a parameter file.
Define a parameter name and parameter value in a parameter file. In the SAP table reader mapping that you
create for delta extraction, specify the same parameter name that you defined in the parameter file. Then,
create a mapping task and specify the parameter file name in the task. Instead of updating the parameter
value in the mapping every time, you can update the parameter value in the parameter file and run the
mapping task again.
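As a minimal sketch, a parameter file entry for the update mode might look as follows. The parameter name $$UpdateMode is arbitrary and must match the parameter name that you specify in the mapping; the exact file layout, including any section headers, depends on your organization, so see the Informatica parameter file documentation for the required format.

```
$$UpdateMode=3
```

To switch from a delta update to a delta repeat, change the value to 4 in the parameter file and run the mapping task again; no change to the mapping itself is needed.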
When you run an SAP Table reader mapping for delta extraction, the Secure Agent fetches changed records
from the SAP table for entries logged for the SAP table name in the Change Document Position (CDPOS)
table. If there are entries logged for the SAP table structure in the CDPOS table, you can fetch those records
by overriding the table name in the Override Table Name for Delta Extraction field in the SAP Table advanced
source properties.
For example, if the table name you specified as the object type in the mapping is CRMD_ORDERADM_H, to get
the delta records for entries captured in the CDPOS table for the structure name, specify the structure name
CRMA_ORDERADM_H in the Override Table Name for Delta Extraction field. The Secure Agent fetches the
records from CRMA_ORDERADM_H that have change entries logged for the table structure.
If the delta data captured in the CDPOS table does not include the structure name, keep this field blank.
• You can perform delta extraction only for a single SAP table. You can't use native joins to join data from
two or more SAP tables.
• You can't configure delta extraction to look up data from SAP tables.
• You can't configure delta extraction with partitioning.
• You can't configure delta extraction for multiple pipelines within the same mapping.
• When the Secure Agent performs a delta extraction, it does not retrieve the records in the same order in
which they were inserted or updated in the SAP table. For example, record 10 might have been updated
first in the SAP table before record 20. However, while extracting the data, the Secure Agent might fetch
record 20 first and then record 10.
• The Secure Agent does not print any information in the session log to indicate whether the records
extracted through delta extraction were part of an insert or update operation in SAP.
• During delta extraction, if there are multiple entries for a key between the current and last updated date
and time, the Secure Agent fetches only the latest entry for the key.
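The last guideline can be illustrated with a small sketch. The records below are hypothetical and do not reflect the actual CDPOS layout; they only show that when a key has multiple entries between two extraction points, only the latest entry survives.

```python
# Hypothetical change-log entries: (key, change_number, value).
changes = [
    ("CUST-1", 101, "inserted"),
    ("CUST-2", 102, "inserted"),
    ("CUST-1", 103, "updated"),   # later change for the same key
]

latest = {}
for key, chng_num, value in changes:
    # A later change number replaces any earlier entry for the key,
    # so each key keeps only its most recent change.
    if key not in latest or chng_num > latest[key][0]:
        latest[key] = (chng_num, value)

print(sorted((k, v[1]) for k, v in latest.items()))
# [('CUST-1', 'updated'), ('CUST-2', 'inserted')]
```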
1. Create a mapping to read data from an SAP table and write data to a target.
2. Click the Source transformation in the mapping.
3. In the Properties panel, click the Source tab.
4. Under the advanced properties, select one of the following values from the Update Mode list:
• 0 - Full
• 1 - Delta initialization without transfer
• 2 - Delta initialization with transfer
• 3 - Delta update
• 4 - Delta repeat
• Parameter
For information about using a parameter for delta extraction, see “Configuring a parameter file for delta
extraction” on page 37.
5. Save and run the mapping.
The error occurs because the SAP table for which you are trying to perform delta extraction is not part of
a change document object in SAP.
If the SAP table and SAP columns for which you want to perform delta extraction are not part of a
change document object in SAP, you cannot perform delta extraction. You can only perform a full
extraction.
Why do I see the error "An error occurred while fetching the current date and time from SAP because there is no entry
present in the /INFADI/TBLCHNGN table. Run a delta initialization session first."?
The error occurs when you run a delta update or delta repeat session directly without performing a delta
initialization.
The delta initialization records the LAST_UPDATED_DAT and LAST_UPDATED_TIM that the Secure Agent
uses to run a delta update or delta repeat session. Without delta initialization, the Secure Agent does not
have access to the LAST_UPDATED_DAT and LAST_UPDATED_TIM to run a delta update or delta repeat
session.
You can refer to the session log to find the session ID for your mapping. In the /INFADI/TBLCHNGN
table, look for the same session ID to view details about your mapping run. The Secure Agent generates
a unique session ID for each mapping run for a particular SAP table. You can also sort the /INFADI/
TBLCHNGN table entries based on the session ID.
Why does the number of records extracted through the Full or Delta initialization with transfer option not match the
number of records extracted through the Delta repeat option?
When you use the Full or Delta initialization with transfer option, the Secure Agent extracts all the
records directly from the SAP table and not only the records that are captured in the change document.
However, when you use the Delta repeat option, the Secure Agent extracts only the records that are
captured in the change document.
Therefore, after you perform a full extraction or delta initialization with transfer, if you run a delta repeat
session, the extracted records count might not match with the number of records extracted through the
Full or Delta initialization with transfer option.
Use key range partitioning for columns that have an even distribution of data values. Otherwise, the partitions
might have unequal size. For example, a column might have 10 rows between key values 1 and 1000 and the
column might have 999 rows between key values 1001 and 2000. If the mapping includes multiple sources,
use the same number of key ranges for each source.
When you define key range partitioning for a column, the Secure Agent reads the rows that are within the
specified partition range. For example, if you configure two partitions for a column with the ranges as 10
through 20 and 30 through 40, the Secure Agent does not read the rows 20 through 30 because these rows
are not within the specified partition range.
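The key range semantics described above can be sketched as follows, assuming inclusive range bounds; the ranges and key values are hypothetical, not the agent's code.

```python
# Partition ranges 10-20 and 30-40, as in the example above.
ranges = [(10, 20), (30, 40)]
keys = list(range(5, 45))  # hypothetical key values 5..44

# A row is read only if its key falls inside some partition range.
read = [k for k in keys if any(lo <= k <= hi for lo, hi in ranges)]
skipped = [k for k in keys if k not in read]

# Keys 21-29 fall between the two ranges and are not read.
print(21 in skipped, 25 in skipped)  # True True
```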
You can configure a partition key for fields of the following data types:
• ACCP
• DATS
• INT1
• INT2
• INT4
• NUMC
• TIMS
You cannot use key range partitions when a mapping includes any of the following transformations:
• Web Services
• XML to Relational
Contact your SAP administrator to increase the value of the parameter on the SAP side.
If you configure an uncached lookup, you can use only the = logical operator in the lookup condition.
When you configure an uncached lookup, ensure that the data does not contain the pipe (|) character;
otherwise, data corruption occurs.
When you use an SAP Table object as a lookup, you do not need to configure specific SAP Table properties.
1. To create a mapping, click Data Integration > New > Mappings. Select Mapping and click Create.
2. Enter a name and description for the mapping, and click OK.
You can use alphanumeric characters and underscores (_) in the mapping name.
3. To configure a source, on the Transformation palette, click Source.
4. In the Properties panel, on the General tab, enter a name and description.
5. Click the Source tab and configure source details.
6. Specify the source type. You can choose one of the following options:
1. To create a mapping task, click Data Integration > New > Tasks and then complete one of the following
steps:
• To create a mapping task based on a mapping, select Mapping Task and click Create.
• To create a mapping task using a template, expand the appropriate template category and select the
template you want to use, and then click Create.
To edit a mapping task, on the Explore page, navigate to the mapping task. In the row that contains the
task, click Actions and select Edit.
2. Enter a name for the task.
Task names must be unique within the organization. Task names can contain alphanumeric characters,
spaces, and the following special characters: _ . + -
Task names are not case sensitive.
3. Select the runtime environment that contains the Secure Agent that you want to use to access the SAP
tables.
4. Select Mapping as the task based on which you want to create the mapping task.
5. Click Select to specify a mapping.
The Select a Mapping dialog box appears.
6. Select a mapping or search for the required mapping and select OK.
The image of the selected mapping appears.
7. Click Next.
If you specified any parameters in the source or target details in the mapping, the Source or Target page
appears. If not, the Schedule page appears.
8. Click Next to configure a schedule and advanced options. Perform any of the following steps based on
your requirements.
a. Click Run this task on schedule and specify the schedule you want to use.
b. Configure the email notification options.
c. Configure advanced options for the task.
d. Configure the advanced source properties and advanced target properties.
e. Specify the execution mode.
9. Optionally, add advanced session properties.
a. Click Add.
b. Select a session property.
c. Configure the value of the session property.
10. Save and run the mapping task.
You can read data from an SAP purchasing document header, the EKKO table, and write the purchasing
details to any target.
In this example to read data from the EKKO table and write the data to a flat file target object, perform the
following steps:
3. Click OK.
7. When you select a CDS view, on the Fields tab, you can view the mandatory and optional parameters:
8. Click Query Options in the Source tab to specify any filter and sort options for the CDS view:
The following image shows the basic filter options configured for the CDS view:
5. On the Advanced tab, do not select the Lookup Caching Enabled checkbox, and then specify the lookup
properties:
You can configure a synchronization task using the Synchronization Task wizard. You can use SAP Table
objects as sources, targets, or lookup objects. You can use expressions to transform the data according to
your business logic, use data filters to filter data before writing it to targets, and sort data in ascending or
descending order of multiple fields.
When you create a task, you can associate it with a schedule to run it at specified times or on regular
intervals. Or, you can run it manually. You can monitor tasks that are currently running and view logs about
completed tasks.
The source properties appear on the Source page of the Synchronization Task wizard when you specify an
SAP Table connection.
Create Relationship
  Creates a relationship between the selected source object and a related source object. Specify a join condition between a source object key field and a related source object key field.

Display technical field names instead of labels
  When selected, displays technical names instead of business names of the fields in the specified source object.

Display source fields in alphabetical order
  When selected, displays source fields in alphabetical order. By default, fields appear in the order returned by the source system.

Data Preview
  Displays the first 10 rows of the first five columns in the object and the total number of columns in the object.
You can also configure advanced source properties when you schedule the synchronization task. Advanced
source properties appear on the Schedule page of the Synchronization Task wizard.
The following table describes the SAP Table advanced source properties:
Number of rows to be fetched
  The number of rows that are randomly retrieved from the SAP table. The default value of zero retrieves all the rows in the table.
If you configure an uncached lookup, you can use only the = logical operator in the lookup condition.
When you configure an uncached lookup, ensure that the data does not contain the pipe (|) character;
otherwise, data corruption occurs.
When you use an SAP Table object as a lookup, you do not need to configure specific SAP Table properties.
d. Click Select.
The Data Preview area displays the first 10 rows of the first five columns in the SAP object and the
total number of columns in the object. To preview all source columns in a file, click Preview All
Columns.
In Monitor, you can monitor the status of the logs after you run the task.
You can view the HTTP and HTTPS log files in the SMICM transaction. Optionally, you can increase the
trace level to 3 to view detailed logs.
You can read General Ledger Accounting line items from the BKPF and BSEG tables in SAP. BSEG is an SAP
Cluster table that is used to store Accounting Document Segment information. BKPF is a Transparent SAP
Table that is used to store Accounting Document Header information. In this example, you can join the BKPF
and BSEG tables and map the source object to a flat file target object.
In this example to write the accounting document details to a flat file object, perform the following steps:
5. Click Next.
7. Select a source object to preview the data. The Data Preview area displays the first 10 rows of the first
five columns in the SAP object. You can also view the total number of columns in the object. To preview
all source columns in a file, click Preview All Columns.
8. To display technical names instead of business names, select Display technical field names instead of
labels.
9. To display source fields in alphabetical order, click Display source fields in alphabetical order.
Native data types are data types specific to the source and target databases or flat files. They appear in
non-SAP sources and targets in the mapping.
SAP data types appear in the Fields tab for Source and Target transformations when you choose to edit
metadata for the fields. SAP performs any necessary conversion between the SAP data types and the
native data types of the underlying source database tables.
Transformation data types are a set of data types that appear in the remaining transformations. They are
internal data types based on ANSI SQL-92 generic data types, which Data Integration uses to move data
across platforms. Transformation data types appear in all remaining transformations in a mapping,
synchronization task, or mapping task.
When Data Integration reads source data, it converts the native data types to the comparable transformation
data types before transforming the data. When Data Integration writes to a target, it converts the
transformation data types to the comparable native data types.
SAP Data Type - Transformation Data Type - Range for Transformation Data Type

DATS - Date/time - Jan 1, 0001 A.D. to Dec 31, 9999 A.D. Precision to the nanosecond.

DF16_DEC - Decfloat16 - Range of 1-15 and scaling of maximum 14. Decimal floating point number stored in BCD format. You can use the DF16_DEC data type when you read data from SAP tables.

DF34_DEC - Decfloat34 - Range of 1-31 and scaling of maximum 30. Decimal floating point number stored in BCD format. You can use the DF34_DEC data type when you read data from SAP tables.

DF16_RAW - Double - Maximum of 16 positions with floating decimal. Decimal floating point number stored in binary format. You can use the DF16_RAW data type when you read data from SAP tables.

DF34_RAW - Double - Maximum of 34 positions with floating decimal. Decimal floating point number stored in binary format. You can use the DF34_RAW data type when you read data from SAP tables.

LRAW - Binary - Uninterrupted sequence of bytes with a maximum length of 255 positions.

PREC - Binary - Uninterrupted sequence of bytes with a maximum length of 255 positions.

RAW - Binary - Uninterrupted sequence of bytes with a maximum length of 255 positions.

TIMS - Date/time - Jan 1, 0001 A.D. to Dec 31, 9999 A.D. Precision to the nanosecond.
Perform the following steps to configure the custom property for the Secure Agent:
You can't use the SSTRING, STRING, or RAWSTRING data types in a task when you write to an SAP table at
this time. Tasks that include these data types for the SAP table writer might fail.
For more information about the issue and steps to avoid the CPIC error, see the following Informatica
Knowledge Base article: KB 000176711.
How to solve the following error that occurs when I import metadata for an SAP table at design time:
“OPTION_NOT_VALID: OPTION_NOT_VALID Message 000 of class SAIS type E”
For more information about the issue when you import metadata for an SAP table at design time, see the
following Informatica Knowledge Base article: KB 000174054.
How can I avoid truncation of the tab character when the CHAR data type in an SAP table contains a tab character at the
end?
For more information about the issue and steps to avoid the truncation error, see the following
Informatica Knowledge Base article: KB 000179163.
How can I improve performance of an SAP Table mapping that is configured for delta extraction for a large volume of
data?
For more information about improving performance of a mapping configured for delta extraction, see the
following Informatica Knowledge Base article: KB 000184837.
How to solve the following error that occurs when I process the query to look up or filter data from the SAP Table
object: An exception with the type CX_SY_DYNAMIC_OSQL_SYNTAX occurred, but was neither handled
locally, nor declared in a RAISING clause.
For more information about the issue and steps to avoid the error, see the following Informatica
Knowledge Base article: KB 000185754.
How can I import the SAP metadata that contains a SSTRING, STRING, or RAWSTRING data type with precision that is
not defined in the SAP system?
For more information about importing the SAP metadata that contains a SSTRING, STRING, or
RAWSTRING data type with precision that is not defined in the SAP system, see “Rules and guidelines for
SSTRING, STRING, and RAWSTRING data types” on page 63.
How to solve the following error that occurs when I use an SAP table as a source object in a synchronization task:
Field QUERYRESULT not a member of TABLES
You need to install the latest transport files and clear the browser cache.
Index

D
Data Integration community
   URL 5
data types
   SAP 61

I
Informatica Global Customer Support
   contact information 6
Informatica Intelligent Cloud Services
   web site 5

P
Partitioning
   key range partitioning for SAP table sources 40

S
SAP
   data types 61
   target 10
   transformations 10
SAP Table mapping example
   defining the mapping 44
SAP Table mappings
   configuring 41
status
   Informatica Intelligent Cloud Services 6
Synchronization tasks
   example 56
   monitoring 55
   multiple SAP object sources 54
   overview 50
   rules and guidelines for SAP sources and targets 8
   SAP Table lookups 51
   SAP Table sources 50

W
web site 5