CIH July2021 CloudIntegrationHub en
July 2021
This software and documentation contain proprietary information of Informatica LLC and are provided under a license agreement containing restrictions on use and
disclosure and are also protected by copyright law. Reverse engineering of the software is prohibited. No part of this document may be reproduced or transmitted in any
form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica LLC. This Software may be protected by U.S. and/or
international Patents and other Patents Pending.
Use, duplication, or disclosure of the Software by the U.S. Government is subject to the restrictions set forth in the applicable software license agreement and as
provided in DFARS 227.7202-1(a) and 227.7702-3(a) (1995), DFARS 252.227-7013(c)(1)(ii) (OCT 1988), FAR 12.212(a) (1995), FAR 52.227-19, or FAR 52.227-14 (ALT III),
as applicable.
The information in this product or documentation is subject to change without notice. If you find any problems in this product or documentation, please report them to
us in writing.
Informatica, Informatica Platform, Informatica Data Services, PowerCenter, PowerCenterRT, PowerCenter Connect, PowerCenter Data Analyzer, PowerExchange,
PowerMart, Metadata Manager, Informatica Data Quality, Informatica Data Explorer, Informatica B2B Data Transformation, Informatica B2B Data Exchange Informatica
On Demand, Informatica Identity Resolution, Informatica Application Information Lifecycle Management, Informatica Complex Event Processing, Ultra Messaging,
Informatica Master Data Management, and Live Data Map are trademarks or registered trademarks of Informatica LLC in the United States and in jurisdictions
throughout the world. All other company and product names may be trade names or trademarks of their respective owners.
Portions of this software and/or documentation are subject to copyright held by third parties, including without limitation: Copyright DataDirect Technologies. All rights
reserved. Copyright © Sun Microsystems. All rights reserved. Copyright © RSA Security Inc. All Rights Reserved. Copyright © Ordinal Technology Corp. All rights
reserved. Copyright © Aandacht c.v. All rights reserved. Copyright Genivia, Inc. All rights reserved. Copyright Isomorphic Software. All rights reserved. Copyright © Meta
Integration Technology, Inc. All rights reserved. Copyright © Intalio. All rights reserved. Copyright © Oracle. All rights reserved. Copyright © Adobe Systems Incorporated.
All rights reserved. Copyright © DataArt, Inc. All rights reserved. Copyright © ComponentSource. All rights reserved. Copyright © Microsoft Corporation. All rights
reserved. Copyright © Rogue Wave Software, Inc. All rights reserved. Copyright © Teradata Corporation. All rights reserved. Copyright © Yahoo! Inc. All rights reserved.
Copyright © Glyph & Cog, LLC. All rights reserved. Copyright © Thinkmap, Inc. All rights reserved. Copyright © Clearpace Software Limited. All rights reserved. Copyright
© Information Builders, Inc. All rights reserved. Copyright © OSS Nokalva, Inc. All rights reserved. Copyright Edifecs, Inc. All rights reserved. Copyright Cleo
Communications, Inc. All rights reserved. Copyright © International Organization for Standardization 1986. All rights reserved. Copyright © ej-technologies GmbH. All
rights reserved. Copyright © Jaspersoft Corporation. All rights reserved. Copyright © International Business Machines Corporation. All rights reserved. Copyright ©
yWorks GmbH. All rights reserved. Copyright © Lucent Technologies. All rights reserved. Copyright © University of Toronto. All rights reserved. Copyright © Daniel
Veillard. All rights reserved. Copyright © Unicode, Inc. Copyright IBM Corp. All rights reserved. Copyright © MicroQuill Software Publishing, Inc. All rights reserved.
Copyright © PassMark Software Pty Ltd. All rights reserved. Copyright © LogiXML, Inc. All rights reserved. Copyright © 2003-2010 Lorenzi Davide, All rights reserved.
Copyright © Red Hat, Inc. All rights reserved. Copyright © The Board of Trustees of the Leland Stanford Junior University. All rights reserved. Copyright © EMC
Corporation. All rights reserved. Copyright © Flexera Software. All rights reserved. Copyright © Jinfonet Software. All rights reserved. Copyright © Apple Inc. All rights
reserved. Copyright © Telerik Inc. All rights reserved. Copyright © BEA Systems. All rights reserved. Copyright © PDFlib GmbH. All rights reserved. Copyright ©
Orientation in Objects GmbH. All rights reserved. Copyright © Tanuki Software, Ltd. All rights reserved. Copyright © Ricebridge. All rights reserved. Copyright © Sencha,
Inc. All rights reserved. Copyright © Scalable Systems, Inc. All rights reserved. Copyright © jQWidgets. All rights reserved. Copyright © Tableau Software, Inc. All rights
reserved. Copyright© MaxMind, Inc. All Rights Reserved. Copyright © TMate Software s.r.o. All rights reserved. Copyright © MapR Technologies Inc. All rights reserved.
Copyright © Amazon Corporate LLC. All rights reserved. Copyright © Highsoft. All rights reserved. Copyright © Python Software Foundation. All rights reserved.
Copyright © BeOpen.com. All rights reserved. Copyright © CNRI. All rights reserved.
This product includes software developed by the Apache Software Foundation (http://www.apache.org/), and/or other software which is licensed under various
versions of the Apache License (the "License"). You may obtain a copy of these Licenses at http://www.apache.org/licenses/. Unless required by applicable law or
agreed to in writing, software distributed under these Licenses is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express
or implied. See the Licenses for the specific language governing permissions and limitations under the Licenses.
This product includes software which was developed by Mozilla (http://www.mozilla.org/), software copyright The JBoss Group, LLC, all rights reserved; software
copyright © 1999-2006 by Bruno Lowagie and Paulo Soares and other software which is licensed under various versions of the GNU Lesser General Public License
Agreement, which may be found at http://www.gnu.org/licenses/lgpl.html. The materials are provided free of charge by Informatica, "as-is", without warranty of any
kind, either express or implied, including but not limited to the implied warranties of merchantability and fitness for a particular purpose.
The product includes ACE(TM) and TAO(TM) software copyrighted by Douglas C. Schmidt and his research group at Washington University, University of California,
Irvine, and Vanderbilt University, Copyright (©) 1993-2006, all rights reserved.
This product includes software developed by the OpenSSL Project for use in the OpenSSL Toolkit (copyright The OpenSSL Project. All Rights Reserved) and
redistribution of this software is subject to terms available at http://www.openssl.org and http://www.openssl.org/source/license.html.
This product includes Curl software which is Copyright 1996-2013, Daniel Stenberg, <[email protected]>. All Rights Reserved. Permissions and limitations regarding this
software are subject to terms available at http://curl.haxx.se/docs/copyright.html. Permission to use, copy, modify, and distribute this software for any purpose with or
without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.
The product includes software copyright 2001-2005 (©) MetaStuff, Ltd. All Rights Reserved. Permissions and limitations regarding this software are subject to terms
available at http://www.dom4j.org/license.html.
The product includes software copyright © 2004-2007, The Dojo Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to
terms available at http://dojotoolkit.org/license.
This product includes ICU software which is copyright International Business Machines Corporation and others. All rights reserved. Permissions and limitations
regarding this software are subject to terms available at http://source.icu-project.org/repos/icu/icu/trunk/license.html.
This product includes software copyright © 1996-2006 Per Bothner. All rights reserved. Your right to use such materials is set forth in the license which may be found at
http://www.gnu.org/software/kawa/Software-License.html.
This product includes OSSP UUID software which is Copyright © 2002 Ralf S. Engelschall, Copyright © 2002 The OSSP Project Copyright © 2002 Cable & Wireless
Deutschland. Permissions and limitations regarding this software are subject to terms available at http://www.opensource.org/licenses/mit-license.php.
This product includes software developed by Boost (http://www.boost.org/) or under the Boost software license. Permissions and limitations regarding this software
are subject to terms available at http://www.boost.org/LICENSE_1_0.txt.
This product includes software copyright © 1997-2007 University of Cambridge. Permissions and limitations regarding this software are subject to terms available at
http://www.pcre.org/license.txt.
This product includes software copyright © 2007 The Eclipse Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms
available at http://www.eclipse.org/org/documents/epl-v10.php and at http://www.eclipse.org/org/documents/edl-v10.php.
This product includes software licensed under the terms at http://www.tcl.tk/software/tcltk/license.html, http://www.bosrup.com/web/overlib/?License,
http://www.stlport.org/doc/license.html, http://asm.ow2.org/license.html, http://www.cryptix.org/LICENSE.TXT, http://hsqldb.org/web/hsqlLicense.html,
http://httpunit.sourceforge.net/doc/license.html, http://jung.sourceforge.net/license.txt, http://www.gzip.org/zlib/zlib_license.html, http://www.openldap.org/software/
release/license.html, http://www.libssh2.org, http://slf4j.org/license.html, http://www.sente.ch/software/OpenSourceLicense.html, http://fusesource.com/downloads/
license-agreements/fuse-message-broker-v-5-3-license-agreement; http://antlr.org/license.html; http://aopalliance.sourceforge.net/; http://www.bouncycastle.org/
licence.html; http://www.jgraph.com/jgraphdownload.html; http://www.jcraft.com/jsch/LICENSE.txt; http://jotm.objectweb.org/bsd_license.html; http://www.w3.org/
Consortium/Legal/2002/copyright-software-20021231; http://www.slf4j.org/license.html; http://nanoxml.sourceforge.net/orig/copyright.html; http://www.json.org/
license.html; http://forge.ow2.org/projects/javaservice/, http://www.postgresql.org/about/licence.html, http://www.sqlite.org/copyright.html, http://www.tcl.tk/
software/tcltk/license.html, http://www.jaxen.org/faq.html, http://www.jdom.org/docs/faq.html, http://www.slf4j.org/license.html; http://www.iodbc.org/dataspace/
iodbc/wiki/iODBC/License; http://www.keplerproject.org/md5/license.html; http://www.toedter.com/en/jcalendar/license.html; http://www.edankert.com/bounce/
index.html; http://www.net-snmp.org/about/license.html; http://www.openmdx.org/#FAQ; http://www.php.net/license/3_01.txt; http://srp.stanford.edu/license.txt;
http://www.schneier.com/blowfish.html; http://www.jmock.org/license.html; http://xsom.java.net; http://benalman.com/about/license/; https://github.com/CreateJS/
EaselJS/blob/master/src/easeljs/display/Bitmap.js; http://www.h2database.com/html/license.html#summary; http://jsoncpp.sourceforge.net/LICENSE; http://
jdbc.postgresql.org/license.html; http://protobuf.googlecode.com/svn/trunk/src/google/protobuf/descriptor.proto; https://github.com/rantav/hector/blob/master/
LICENSE; http://web.mit.edu/Kerberos/krb5-current/doc/mitK5license.html; http://jibx.sourceforge.net/jibx-license.html; https://github.com/lyokato/libgeohash/blob/
master/LICENSE; https://github.com/hjiang/jsonxx/blob/master/LICENSE; https://code.google.com/p/lz4/; https://github.com/jedisct1/libsodium/blob/master/
LICENSE; http://one-jar.sourceforge.net/index.php?page=documents&file=license; https://github.com/EsotericSoftware/kryo/blob/master/license.txt; http://www.scala-
lang.org/license.html; https://github.com/tinkerpop/blueprints/blob/master/LICENSE.txt; http://gee.cs.oswego.edu/dl/classes/EDU/oswego/cs/dl/util/concurrent/
intro.html; https://aws.amazon.com/asl/; https://github.com/twbs/bootstrap/blob/master/LICENSE; https://sourceforge.net/p/xmlunit/code/HEAD/tree/trunk/
LICENSE.txt; https://github.com/documentcloud/underscore-contrib/blob/master/LICENSE, and https://github.com/apache/hbase/blob/master/LICENSE.txt.
This product includes software licensed under the Academic Free License (http://www.opensource.org/licenses/afl-3.0.php), the Common Development and
Distribution License (http://www.opensource.org/licenses/cddl1.php) the Common Public License (http://www.opensource.org/licenses/cpl1.0.php), the Sun Binary
Code License Agreement Supplemental License Terms, the BSD License (http://www.opensource.org/licenses/bsd-license.php), the new BSD License (http://
opensource.org/licenses/BSD-3-Clause), the MIT License (http://www.opensource.org/licenses/mit-license.php), the Artistic License (http://www.opensource.org/
licenses/artistic-license-1.0) and the Initial Developer’s Public License Version 1.0 (http://www.firebirdsql.org/en/initial-developer-s-public-license-version-1-0/).
This product includes software copyright © 2003-2006 Joe Walnes, 2006-2007 XStream Committers. All rights reserved. Permissions and limitations regarding this
software are subject to terms available at http://xstream.codehaus.org/license.html. This product includes software developed by the Indiana University Extreme! Lab.
For further information please visit http://www.extreme.indiana.edu/.
This product includes software Copyright (c) 2013 Frank Balluffi and Markus Moeller. All rights reserved. Permissions and limitations regarding this software are subject
to terms of the MIT license.
DISCLAIMER: Informatica LLC provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied
warranties of noninfringement, merchantability, or use for a particular purpose. Informatica LLC does not warrant that this software or documentation is error free. The
information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation
is subject to change at any time without notice.
NOTICES
This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software
Corporation ("DataDirect") which are subject to the following terms and conditions:
1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.
2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT INFORMED OF THE POSSIBILITIES
OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT LIMITATION, BREACH OF CONTRACT, BREACH
OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS.
Table of Contents

System Properties
Chapter 4: Applications
  Application management
    Creating an Application
    Adding a publication or a subscription to an existing application
  Application properties
Chapter 5: Topics
  Topic structure
    Create topic tables
    Using metadata files to create topic tables
    Topic structure updates
  Topic data retention
  Topic management
    Creating a topic
    Subscribing to a topic
  Topic properties
    Topic Diagram
    General Details properties
    Topic Structure properties
    Publications properties
    Subscriptions properties
Chapter 7: Publications
  Publication types
  Publication processes
    Publication process for publications that trigger Data Integration tasks
    Publication process for publications that publish data with an API
  Publication mapping
  Publication sources
  Publication schedules
  Publication management
    Creating a publication that triggers a Data Integration task
    Creating a publication that publishes data with an API
    Running a publication manually
    Disabling and enabling a publication
  Publication properties
Chapter 8: Subscriptions
  Subscription types
  Subscription processes
    Subscription process for subscriptions that trigger Data Integration tasks
    Subscription process for subscriptions that consume data with an API
  Subscription mapping
  Subscription targets
  Subscription schedules
  Subscription retry policy
  Subscription management
    Creating a subscription that triggers a Data Integration task
    Creating a subscription that consumes data with an API
    Running a subscription manually
    Getting previous publications for a subscription
    Disabling and enabling a subscription
  Subscription properties
Chapter 11: Glossary
Index
Preface
Use this guide to learn how to create and manage Cloud Integration Hub assets, including applications,
topics, publications, and subscriptions. Learn how to perform administrative tasks such as
organization management and asset migration, and how to track and monitor Cloud Integration Hub events.
Informatica Resources
Informatica provides you with a range of product resources through the Informatica Network and other online
portals. Use the resources to get the most from your Informatica products and solutions and to learn from
other Informatica users and subject matter experts.
Informatica Network
The Informatica Network is the gateway to many resources, including the Informatica Knowledge Base and
Informatica Global Customer Support. To enter the Informatica Network, visit
https://network.informatica.com.
To search the Knowledge Base, visit https://search.informatica.com. If you have questions, comments, or
ideas about the Knowledge Base, contact the Informatica Knowledge Base team at
[email protected].
Informatica Documentation
Use the Informatica Documentation Portal to explore an extensive library of documentation for current and
recent product releases. To explore the Documentation Portal, visit https://docs.informatica.com.
If you have questions, comments, or ideas about the product documentation, contact the Informatica
Documentation team at [email protected].
Informatica Product Availability Matrices
Product Availability Matrices (PAMs) indicate the versions of the operating systems, databases, and types of
data sources and targets that a product release supports. You can browse the Informatica PAMs at
https://network.informatica.com/community/informatica-network/product-availability-matrices.
Informatica Velocity
Informatica Velocity is a collection of tips and best practices developed by Informatica Professional Services
and based on real-world experiences from hundreds of data management projects. Informatica Velocity
represents the collective knowledge of Informatica consultants who work with organizations around the
world to plan, develop, deploy, and maintain successful data management solutions.
You can find Informatica Velocity resources at http://velocity.informatica.com. If you have questions,
comments, or ideas about Informatica Velocity, contact Informatica Professional Services at
[email protected].
Informatica Marketplace
The Informatica Marketplace is a forum where you can find solutions that extend and enhance your
Informatica implementations. Leverage any of the hundreds of solutions from Informatica developers and
partners on the Marketplace to improve your productivity and speed up time to implementation on your
projects. You can find the Informatica Marketplace at https://marketplace.informatica.com.
Informatica Global Customer Support
To find your local Informatica Global Customer Support telephone number, visit the Informatica website at
the following link:
https://www.informatica.com/services-and-training/customer-success-services/contact-us.html.
To find online support resources on the Informatica Network, visit https://network.informatica.com and
select the eSupport option.
Chapter 1
To publish data to Cloud Integration Hub, first define the data set that you want to manage, for example,
sales, customers, or orders. You define a data set by defining a topic. A topic defines the structure of the
data that Cloud Integration Hub stores in the publication repository and the type of publication repository
where data is stored. You can manage multiple topics that represent different data sets in Cloud Integration
Hub. Applications publish data to topics and subscribe to data sets that are represented by topics.
Multiple applications can publish to the same topic, for example, different ordering applications can publish
their orders to the same Orders topic. Multiple subscribers can consume the data from a topic. Different
subscribing applications can consume the data in different formats and in different latencies based on a
defined schedule.
Cloud Integration Hub stores the data that applications publish to topics in the Cloud Integration Hub
publication repository in the following ways:
• For each publication instance, the retention period for consumed data starts after all the subscribers have
either successfully consumed or discarded the data, that is, after all the events that are associated with
the publication instance are either in a Complete or in a Discarded event status. After all the subscribers
consume or discard the data, Cloud Integration Hub stores the consumed data in the publication
repository until the retention period for consumed data expires, and then deletes the consumed data from
the publication repository.
• Cloud Integration Hub stores unconsumed data in the publication repository until the retention period for
unconsumed data expires, and then deletes the unconsumed data from the publication repository.
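The rule above, that the retention clock for a publication instance starts only after every associated event is Complete or Discarded, can be sketched as follows. The data structures here are hypothetical illustrations, not the product's API:

```python
from dataclasses import dataclass

@dataclass
class Event:
    status: str  # e.g. "Complete", "Discarded", or "Processing"

def retention_period_started(instance_events: list[Event]) -> bool:
    """The retention period for consumed data starts only after all events
    associated with the publication instance are Complete or Discarded."""
    return all(e.status in ("Complete", "Discarded") for e in instance_events)

print(retention_period_started([Event("Complete"), Event("Discarded")]))   # True
print(retention_period_started([Event("Complete"), Event("Processing")]))  # False
```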
Applications can use PowerExchange® adapters and Informatica Intelligent Cloud Services℠ connectors to
share data from different sources, such as database tables, files, or any sources that Informatica supports.
Each application can be a publisher and a subscriber to different topics.
Publications publish to a specific topic. A publication defines the data source type and the location from
where Cloud Integration Hub retrieves the data that the application publishes. Subscriptions subscribe to one
or more topics. A subscription defines the data target type and the location in the subscribing application to
where Cloud Integration Hub sends the published data.
Examples
Your organization uses multiple applications. Some of the applications are located on-premises and some
are located on the cloud. Your applications require the following data:
Marketing application
Requires data about campaigns, accounts, contracts, and employees for operational purposes.
Data warehouse
Requires data about sales department employees, including sales representatives, for operational
purposes.
With Cloud Integration Hub, you can address the following use cases:
• Share the daily account updates from the CRM application with the marketing application.
• Share the campaign details from the CRM application with the marketing, data warehouse, and CRM
applications at varying schedules.
• Share the weekly contract details from the CRM application with the marketing and data warehouse
applications.
• Share the daily order updates from the CRM application with the marketing application.
• Share the monthly employee details from the HR application with the CRM application.
You can select to host the publication repository for the organization on-premises or on a private cloud. In
that case, the repository is not hosted on Informatica Intelligent Cloud Services Hosting Services but is
installed and managed by the organization.
Cloud Integration Hub consists of the following components:

Web client
User interface to manage applications, topics, publications, and subscriptions, and to monitor
publications, subscriptions, and events. Administrators use the Web client to create the organization in
Cloud Integration Hub.

Data Integration
User interfaces to define sources and targets and to create connections, mappings, and tasks.

Informatica Intelligent Cloud Services Hosting Services
Services that host the Cloud Integration Hub service and repositories. The services store all task and
organization information.

Cloud Integration Hub server
A service that manages publication and subscription processing in Cloud Integration Hub.

Cloud Integration Hub repository
Database that stores metadata and runtime data for Cloud Integration Hub applications, topics,
publications, subscriptions, and events.

Publication repository
Database that stores published data until the retention period for the data expires. You can use a hosted
publication repository or a private repository.

Sources and targets
Sources and targets that you use to publish and consume data, such as database tables and files.
System Requirements
The following table describes the minimum system requirements for Cloud Integration Hub.
Verify that the system meets the requirements that are applicable for the setup of the organization.
Access via a proxy gateway
The following URL is accessible from the machine where the Secure Agent is installed:
https://<pod>.<baseUrl>/
Where:
- <pod> is the name of the Informatica Intelligent Cloud Services point of delivery (PoD) where you access
Cloud Integration Hub. For example: cih-pod1 or emw1-cih.
- <baseUrl> is the Informatica Intelligent Cloud Services URL. For example: dm-us.informaticacloud.com.
For example:
https://cih-pod1.dm-us.informaticacloud.com/
Tip: You can copy the values of <pod> and <baseUrl> from the Cloud Integration Hub URL after you access
it from the My Services page of Informatica Intelligent Cloud Services.
Informatica recommends that you add the URL to the whitelist of the proxy server.
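To illustrate how the URL is assembled from <pod> and <baseUrl>, here is a minimal sketch using the example values from the text. The values themselves are examples, not fixed endpoints:

```python
# Compose the Cloud Integration Hub URL from its parts.
# pod and base_url use the example values from the text, not fixed endpoints.
pod = "cih-pod1"
base_url = "dm-us.informaticacloud.com"

url = f"https://{pod}.{base_url}/"
print(url)  # https://cih-pod1.dm-us.informaticacloud.com/
```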
Cloud Integration Hub private publication repository
You can use one of the following database systems:
- Oracle
- Microsoft SQL Server
- MySQL
Note: For more information about supported editions and versions, see the Product Availability Matrix
(PAM).
The private publication repository requires at least 512 MB of disk space for the publication repository
database. The actual requirement depends on the number of publications and publication instances that
you need to retain.
Note: Unicode data requires twice as much storage as single-byte character sets.
Multiple database connections for the private publication repository must always be available. The
number of required connections depends on the number of publications and subscriptions that run
concurrently. Use the following formula to calculate the number of required database connections:
NumberOfConcurrentPublicationsOrSubscriptions x 3 + 2
If you do not have enough database connections available, Cloud Integration Hub might fail or encounter
database deadlocks.
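As a quick sanity check, the connection formula can be expressed as a small helper. This is a hypothetical function for illustration, not part of the product:

```python
def required_db_connections(concurrent_runs: int) -> int:
    """Return the minimum number of database connections for the private
    publication repository: concurrent publications/subscriptions x 3 + 2."""
    if concurrent_runs < 0:
        raise ValueError("concurrent_runs must be non-negative")
    return concurrent_runs * 3 + 2

# For example, 10 publications and subscriptions that run concurrently
# need at least 32 connections:
print(required_db_connections(10))  # 32
```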
For more information about system requirements, see the Product Availability Matrix (PAM) for Informatica
Intelligent Cloud Services. PAMs indicate the versions of operating systems, databases, and other types of
data sources and targets that a product release supports. You can access the PAMs on the Informatica
Network at https://network.informatica.com/community/informatica-network/product-availability-matrices/.
Use the navigator to create assets, track events, and explore and perform actions on existing assets.
The Hub Overview diagram provides a visual overview of the existing assets. Use the View filter to filter the
assets that the Hub Overview diagram shows.
If you use a hosted publication repository, the repository storage usage shows at the top right of the Hub
Overview diagram.
The Hub Overview diagram provides a visual overview of the existing assets, grouped into categories.
When you rest on an asset in the diagram, all related assets are highlighted. For example, when you rest on a
topic, the applications and the publications that publish to the topic and the subscriptions that subscribe to
the topic are highlighted. When you click an asset, a drill down view of the asset and its relations to other
assets appears. For example, when you click a publication, the drill down view shows the publishing
application, the topic to which the publication publishes data, and the subscriptions that subscribe to the
topic.
When you right-click an asset in the drill down view, an action menu opens. You can perform the following
actions from the menu, based on the asset type:
Filters
You can filter the Hub Overview diagram to the following views:
The following table lists the navigator icons and describes the functions that they perform:
New
Create a new asset: application, publication, subscription, topic, or monitoring rule.
Explore page
Use the Explore page to work with your Informatica Intelligent Cloud Services projects and assets.
• Explore by projects and folders. View all projects or select a particular project.
• Explore by asset types. View all assets or view assets of a particular type.
• Explore by tags. View assets associated with a particular tag.
• Search for projects or assets. To search all projects, folders, and assets in the organization, view the
Explore page by All Projects, and then enter a name or description in the Find box. Or, to narrow your
search, filter the Explore page by All Assets, select a specific asset type, project, or folder, and then enter
a name or description in the Find box.
• Sort the search results. Sort the Explore page by name, last update date, description, or type. When you
sort by type, the Explore page groups assets by asset type. It does not list the asset types in alphabetical
order.
You can see projects, folders, and assets for all of the services that you use. If you select an asset to open it
or perform an action and the asset is created in a different service than the one you have open, the service
opens in a new browser tab.
Note:
Before you access Cloud Integration Hub for the first time, the administrator sets up the organization in Informatica Intelligent Cloud Services and then sets up the organization in Cloud Integration Hub. If the Organization Cloud Setup dialog box appears when you access Cloud Integration Hub for the first time, your administrator has not yet provisioned the organization to the hub. Contact your administrator or follow the instructions on the screen. For details, see “Organization management” on page 22.
1. On the Informatica Intelligent Cloud Services login page, enter your Informatica Intelligent Cloud
Services user name and password.
2. Click Log In.
The Informatica Intelligent Cloud Services My Services page appears.
3. Select Integration Hub.
The Cloud Integration Hub application appears.
Note: The Integration Hub link appears on the My Services page if your organization has the required licenses. If the link doesn't appear on the My Services page, contact your administrator.
For example, a Sales topic that represents sales data. Applications from all the stores in the organization
publish sales data to the Sales topic. The accounting application subscribes to the Sales topic and consumes
published sales data from all stores, or, if a filter is applied, from specific stores.
Publication repository
Cloud Integration Hub stores topic data in a publication repository, in a structure that represents the
structure in which you want to keep the data.
The publication repository stores the data for a short intermediate period after the data is consumed by all
subscribers.
Cloud Integration Hub stores the data in the publication repository in the following ways:
• For each publication instance, the retention period for consumed data starts after all the subscribers have either successfully consumed or discarded the data. That is, after all the events that are associated with the publication instance are either in a Complete or in a Discarded event status. After all the subscribers consume or discard the data, Cloud Integration Hub stores the consumed data in the publication repository until the retention period for consumed data expires, and then deletes the consumed data from the publication repository.
• Cloud Integration Hub stores unconsumed data in the publication repository until the retention period for
unconsumed data expires, and then deletes the unconsumed data from the publication repository.
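The condition that starts the retention period for consumed data can be modeled as a simple check over the subscription event statuses. This is an illustrative model of the documented behavior, not a product API; the function name is an assumption:

```python
def retention_can_start(event_statuses: list[str]) -> bool:
    """The retention period for a publication instance's consumed data
    starts only after every subscription event associated with the
    instance is in a Complete or Discarded status."""
    return all(s in ("Complete", "Discarded") for s in event_statuses)

# Both subscribers finished, so the retention clock starts:
print(retention_can_start(["Complete", "Discarded"]))   # True
# One subscriber is still processing, so the data is unconsumed:
print(retention_can_start(["Complete", "Processing"]))  # False
```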
You can use a hosted publication repository or a private publication repository in Cloud Integration Hub.
Hosted publication repository
Cloud Integration Hub hosts and manages the publication repository on Informatica Intelligent Cloud Services Hosting Services. Storage usage of the repository shows on the Cloud Integration Hub home page.
Private publication repository
Use your own, private repository. A private publication repository can reside on-premises or on the organization's private cloud. For more information about setting up a private publication repository, see “Set up a private publication repository” on page 27.
You develop Data Integration tasks for Cloud Integration Hub in the same way that you develop other Data
Integration tasks. You use the Cloud Integration Hub connection as the target in publication tasks and as the
source in subscription tasks.
Cloud Integration Hub Publications and
Subscriptions
Publications and subscriptions are entities that define how applications publish data to Cloud Integration
Hub and how applications consume data from Cloud Integration Hub. Publications publish data to a defined
topic and subscriptions subscribe to topics.
Publications and subscriptions control the data flow and the schedule of data publication or data
consumption. An application can be a publisher and a subscriber. Multiple applications can publish to the
same topic. Multiple applications can consume data from the same topic.
Publications and subscriptions can publish from and subscribe to any type of source and target that
Informatica Intelligent Cloud Services supports. You can publish from and subscribe to different sources of
data. Because the publishing process and the consuming process are completely decoupled, the publishing
source and the consuming target do not have to be of the same data type. For example, you can publish data
from a file and consume it into a database.
Publications and subscriptions can publish and consume data by triggering a Data Integration task or with an
API. For publications and subscriptions that trigger a Data Integration task, you create the tasks in
Informatica Intelligent Cloud Services. You then select a task when you create the publication or subscription
in Cloud Integration Hub. For publications and subscriptions that are triggered by an API, you run the API
manually.
When data transfer is complete, the topic data set is ready for subscribers. The subscription process starts
when one of the following conditions exist, based on the configuration of data consumption in the
subscriptions:
Cloud Integration Hub generates events to track the progress of the publication and subscription process.
When an application publishes data, Cloud Integration Hub creates a parent publication event. When the
publication process ends and the published data is ready to consume, Cloud Integration Hub generates a
child event for each subscription.
The events change status as the publication and subscription process progresses, and reach a completed
status after the process ends successfully. You also use events to monitor and troubleshoot issues that
might occur during the process.
During the publication or the subscription process, Cloud Integration Hub communicates with Informatica
Intelligent Cloud Services, going through the following stages:
• When a cloud application publishes a data set, the Cloud Integration Hub server triggers the Data
Integration task that is defined for the publication through an Informatica Intelligent Cloud Services REST
API.
• For cloud publications, the target is defined using a Cloud Integration Hub cloud connector. The
publication process uses the connector to write the data to Cloud Integration Hub.
Hub administration
Before the organization can use Cloud Integration Hub, you must set up an organization in Informatica
Intelligent Cloud Services and then set up the organization in Cloud Integration Hub.
After you set up the organization in Informatica Intelligent Cloud Services, you can perform one or more of
the following tasks:
• Deploy the Cloud Integration Hub Salesforce Accelerator package for rapid synchronization of data from
Salesforce to other applications through Cloud Integration Hub. Deploying the package creates the
components that are required to connect the Salesforce application to Cloud Integration Hub. Some of the
components are created in Cloud Integration Hub and some are created in Informatica Intelligent Cloud
Services.
• Set up a private publication repository to store topic data.
• Modify the policy for writing data to intermediate staging in subscription flows.
• To view Data Integration Hub publication and subscription events in Cloud Integration Hub, configure
Cloud Integration Hub system properties.
• Configure an external load balancer URL as the base API URL of publications and subscriptions that
publish and consume data with an API to a private publication repository.
Organization management
Before the organization can use Cloud Integration Hub, you must set up an organization in Informatica
Intelligent Cloud Services and then set up the organization in Cloud Integration Hub.
When you set up the organization in Cloud Integration Hub, Cloud Integration Hub creates the connection
Cloud Integration Hub in Informatica Intelligent Cloud Services.
Warning: Do not rename the connection. The only connection property that you can change is the option Do
not use intermediate staging for subscription flows. For more details, see “Intermediate staging policy for
subscriptions” on page 28.
Editing other connection properties or renaming the connection might result in errors at run time.
If you select to use a hosted publication repository, Cloud Integration Hub creates the Cloud Integration Hub
publication repository on Informatica Intelligent Cloud Services Hosting Services.
Before you begin
Before you set up the organization in Cloud Integration Hub, verify that the following conditions exist in Informatica Intelligent Cloud Services.
Configuration
From the Configure menu, under Runtime Environments, verify that the Secure Agent is running.
Administration
From the Administer menu, under Licenses, verify that the following conditions exist:
REST API license
Maximum Concurrent Sessions is set to a high value, for example, 100 sessions.
Proxy Settings
If your organization uses an outgoing proxy server to connect to the internet, set the following JVM options
on the Secure Agent:
Name Value
JVMOption3 -Dhttp.useProxy=true
After the Secure Agent restarts, check the agent core log file to verify that the correct proxy server is used.
The agent core log file is the following file:
<Secure Agent installation directory>\apps\agentcore\agentcore.log
To find the proxy information, search for "proxy" in the log file.
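The manual search for "proxy" in the agent core log can be scripted. The following sketch is an assumption for illustration; the sample log lines are invented and the real agentcore.log format may differ:

```python
import re

def proxy_lines(log_text: str) -> list[str]:
    """Return the log lines that mention a proxy, mirroring a manual
    case-insensitive search for "proxy" in agentcore.log."""
    return [line for line in log_text.splitlines()
            if re.search(r"proxy", line, re.IGNORECASE)]

# Hypothetical log content; real agentcore.log lines will differ:
sample = "INFO starting agent\nINFO using proxy server myproxy:8080\nINFO ready"
print(proxy_lines(sample))  # ['INFO using proxy server myproxy:8080']
```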
Before you can set up an organization in Cloud Integration Hub you must set up the organization in
Informatica Intelligent Cloud Services. For details about setting up an organization, see the Informatica
Intelligent Cloud Services Administrator help.
2. Define the required settings and then click Save.
Property Description

Organization Name
Name of the organization in Informatica Intelligent Cloud Services. Appears in view-only mode.

Organization ID
ID of the organization in Informatica Intelligent Cloud Services. Appears in view-only mode.

Informatica Cloud User
Name of the Informatica Intelligent Cloud Services user to use at run time. The user must have an Admin user role in Informatica Intelligent Cloud Services.

Informatica Cloud Password
Password for the Informatica Intelligent Cloud Services user to use at run time.

Runtime Environment
Informatica Intelligent Cloud Services Secure Agent runtime environment to use at run time.

Organization Publication Repository
Database that stores published data until the retention period for the data expires. Choose a hosted or private publication repository.
If you choose a private publication repository, enter the following parameters:
- Repository Type. Choose an Oracle or a Microsoft SQL Server database.
- Repository URL. JDBC URL of the repository, based on the database type:
  - Oracle: jdbc:informatica:oracle://<ip>:<port>;sid=<sid>;
  - Microsoft SQL Server: jdbc:informatica:sqlserver://<ip>:<port>;DatabaseName=<DatabaseName>;
- User. Name of the user to access the repository.
- User Role. Role granted to the user to access the repository, based on the database type:
  - On an Oracle database, the user must be granted the CONNECT and RESOURCE roles.
  - On a Microsoft SQL Server database, the user must be granted the db_datareader, db_datawriter, and db_ddladmin roles, and you might want to grant the user the db_owner role.
- Password. Password of the user.
- Database Name. If you use a Microsoft SQL Server database, name of the database.
- Repository Schema. If you use an Oracle database, schema used with the repository.

Rotate Key
Click Rotate Key to rotate the encryption key used for data encryption.
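The Repository URL follows one of two documented templates. The following sketch assembles them; the function name and the example host, port, SID, and database values are assumptions for illustration:

```python
def repository_jdbc_url(db_type: str, host: str, port: int,
                        sid: str = "", database: str = "") -> str:
    """Assemble the Repository URL from the documented JDBC templates
    for an Oracle or a Microsoft SQL Server private repository."""
    if db_type == "oracle":
        return f"jdbc:informatica:oracle://{host}:{port};sid={sid};"
    if db_type == "sqlserver":
        return (f"jdbc:informatica:sqlserver://{host}:{port};"
                f"DatabaseName={database};")
    raise ValueError("Repository Type must be Oracle or Microsoft SQL Server")

# Hypothetical host and SID values:
print(repository_jdbc_url("oracle", "10.0.0.5", 1521, sid="orcl"))
# jdbc:informatica:oracle://10.0.0.5:1521;sid=orcl;
```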
Warning: When you set up the organization in Cloud Integration Hub, Cloud Integration Hub creates the
connection Cloud Integration Hub in the organization in Informatica Intelligent Cloud Services. Do not
rename or edit this connection. Editing the connection or changing the connection name might result in
errors at run time.
The package includes components required to connect the Salesforce application to Cloud Integration Hub,
including the following components:
After you deploy the package, you can use the Salesforce Accelerator components to publish the Contacts,
Accounts, and Opportunities tables from Salesforce to the topic in the hub and use the sample subscribing
application to consume the data and write it to a file.
In addition, verify that the organization's Salesforce cloud application includes the tables Accounts, Contacts, and Opportunities, and that the Cloud Integration Hub user has privileges to read the tables.
mct_CIH_sub_Account_Contact_Opportunity
A mapping task that consumes data from the Cloud Integration Hub Salesforce topic to the flat file connection. Created in Informatica Intelligent Cloud Services.
Note: If any of the Salesforce Accelerator package components exist in Informatica Intelligent Cloud Services
or in Cloud Integration Hub, the deploy operation fails.
1. Click the Salesforce Accelerator link in the upper right corner of the screen.
2. Click Yes in the confirmation message.
Database
A private publication repository must reside on an Oracle, Microsoft SQL Server, or MySQL database. The
repository must be accessible through the Informatica Intelligent Cloud Services Secure Agent. To
optimize performance, set up the Secure Agent and the private repository on the same machine.
Verify that you have the user names and passwords for the required database user accounts that you
create. The database user accounts must have privileges to perform the following actions:
- Views
- Synonyms
- Indexes
- Triggers
• Create, change, delete, and run stored procedures and functions.
If you use a Microsoft SQL Server database, consider granting database owner privileges to the database
user accounts.
Language support
To support UTF-8 character encoding on Oracle Database, configure the database to use the following
character set: <AMERICAN_AMERICA.AL32UTF8>.
Configure the following operating system settings of the Secure Agent machine:
When you use a private publication repository, by default, Cloud Integration Hub writes published data to the publication repository and reads data from the publication repository through the publication repository service (PRS).
To configure Cloud Integration Hub to bypass the PRS in publication and subscription flows, select the option Use JDBC for Private Publication Repository in the Cloud Integration Hub connection.
Warning:
• Do not edit any of the other connection properties unless you are instructed to do so when performing
other tasks.
By default, the port number of the publication repository service is 19443. You can change the port number.
1. In Administrator, select Runtime Environments, and then, on the Runtime Environments page, click the
name of the Secure Agent that Cloud Integration Hub uses at run time.
Note: You might have to expand the Secure Agent group to see the list of Secure Agents within the
group.
2. On the Details tab, in the upper right corner, click Edit.
3. In the System Configuration Details area, select CIH Processor.
4. Click the Edit Agent Configuration icon next to api-port and enter the port number.
5. Click Save.
You can assign a different keystore to use with the publication repository.
For performance tuning purposes, when the application consumes the data from the publication repository,
Cloud Integration Hub writes the data to a local folder and then writes the data to the target location.
You can disable writing to intermediate staging on the local server in the Cloud Integration Hub connection.
When intermediate staging is not used, the Data Integration task reads the data from Cloud Integration Hub
and then writes the data directly to the target location. Disabling writing to intermediate staging might affect
system performance.
To disable writing to intermediate staging, select the option Do not use intermediate staging for subscription
flows in the Cloud Integration Hub connection.
Warning:
• Do not edit any of the other connection properties unless you are instructed to do so when performing
other tasks.
• Do not rename the connection.
Editing connection properties unnecessarily or renaming the connection might result in errors at run time.
dih.console.username
Enter the user name of the user account of the Data Integration Hub console.

dih.console.password
Enter the password of the user account of the Data Integration Hub console.
Data Integration Hub events show on the Cloud Integration Hub Events page.
If the load balancer system property is not configured, publications and subscriptions that publish and
consume data with an API use the first agent URL as the base API URL.
System Properties
System properties determine Cloud Integration Hub behavior, such as showing events and identifying the load balancer. You can access the System Properties page from the System Properties link on the top right of the
Cloud Integration Hub Home page. To configure and edit the system properties in Cloud Integration Hub, you
must be assigned the Admin role.
dih.console.accessmode
Access mode for the Data Integration Hub console to show Data Integration Hub events in Cloud Integration Hub.
If the Cloud Integration Hub server can access Data Integration Hub REST APIs, set the value to direct.
If the Cloud Integration Hub server can't access Data Integration Hub REST APIs, set the value to cihprocessor. Your organization must have a valid CIHProcessor license in Informatica Intelligent Cloud Services, and CIHProcessor must be able to access the Data Integration Hub REST APIs.

dih.console.username
User name of the user account of the Data Integration Hub console.

dih.console.password
Password of the user account of the Data Integration Hub console.
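Before saving these system properties, the allowed values can be checked. The property names come from the documentation, but this validation helper is an illustrative sketch, not part of the product:

```python
VALID_ACCESS_MODES = {"direct", "cihprocessor"}

def validate_dih_console_properties(props: dict) -> list[str]:
    """Return a list of problems with the Data Integration Hub console
    system properties; an empty list means the values look valid."""
    errors = []
    if props.get("dih.console.accessmode") not in VALID_ACCESS_MODES:
        errors.append("dih.console.accessmode must be 'direct' or 'cihprocessor'")
    for key in ("dih.console.username", "dih.console.password"):
        if not props.get(key):
            errors.append(f"{key} is required")
    return errors

print(validate_dih_console_properties({
    "dih.console.accessmode": "direct",
    "dih.console.username": "admin",      # hypothetical account
    "dih.console.password": "secret",
}))  # []
```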
You can manage your Informatica Intelligent Cloud Services projects and assets in the following ways:
• View assets.
• Edit assets.
• Move folders or assets to other locations on the Explore page.
• Delete projects, folders, or assets.
• Export assets, import assets, and migrate assets from one organization to another organization. Assets
include applications, topics, publications, subscriptions, and monitoring rules.
• Apply tags so you can filter for related assets on the Explore page.
For more information about additional actions that you can perform on assets and for asset properties, see
the chapters relevant to the asset type.
Viewing an asset
Use the Explore page to view assets, such as applications, topics, publications, and subscriptions. When you
view a topic, the topic diagram appears by default. The topic diagram displays a graphical representation of
the topic and the applications, publications, and subscriptions that are associated with the topic.
1. On the Explore page, navigate to the object that you want to view.
2. In the row that contains the object, click Actions and select View.
Tip: You can also view a publication or a subscription from the topic that the asset is associated with by
right-clicking the asset on the Publications or Subscriptions area of the topic page and selecting View.
The asset appears.
Editing an asset
Use the Explore page to edit assets.
1. On the Explore page, navigate to the object that you want to edit.
2. In the row that contains the object, click Actions and select Edit.
The asset appears.
3. Edit the asset and then click Save.
Editing a topic
You can edit a topic to change the topic structure.
1. On the Explore page, in the row that contains the object, click Actions and select Edit.
The topic page appears. You can expand or collapse the areas of the page.
2. Perform one or more of the following tasks:
• To edit the general details of the topic, scroll to the General Details area.
• To edit the topic structure, scroll to the Topic Structure area.
• To create, edit, disable, or delete publications that publish to the topic, scroll to the Publications area.
• To create, edit, disable, or delete subscriptions that subscribe to the topic, scroll to the Subscriptions
area.
3. Click Save.
1. On the Explore page, navigate to the folder or asset that you want to move.
2. If your organization has enabled source control, check out the folder or assets that you want to move.
If you want to move a folder, be sure to check out the folder and each of the assets within the folder.
3. In the row that contains the folder or asset, click Actions and select Move To, and then browse to the
new location.
4. If the folder or assets are checked out, check them in so that the Git repository reflects the new
structure.
Delete a project, folder, or asset from the Explore page:
1. To delete a project, folder, or asset, on the Explore page, navigate to the object that you want to delete.
2. In the row that contains the project, folder, or asset, click Actions and select Delete.
Tip: You can also delete a publication or a subscription from the topic that the asset is associated with
by right-clicking the asset on the Publications or Subscriptions area of the topic page and selecting
Delete.
User roles
A role is a collection of privileges that you can assign to users and groups. To ensure that every user can
access assets and perform tasks in your organization, assign at least one role to each user or user group.
Administrators assign roles for the organization in Administrator. For more information, see User roles in the
Administrator help.
To perform actions on Cloud Integration Hub assets, including applications, monitoring rules, publications,
subscriptions, and topics, Cloud Integration Hub users need privileges for the assets that they will use. For
example, to run publications, users need run privileges for the Hub Publication asset. The Informatica
Intelligent Cloud Services system-defined roles Designer, Admin, and Monitor define access privileges for
Cloud Integration Hub assets.
The Designer and Admin roles grant the following privileges for Cloud Integration Hub assets:
To configure and edit the system properties, users must be assigned the Admin role.
Monitor role
The Monitor role grants read privileges for all Cloud Integration Hub assets.
Privileges
Privileges determine the access a user has at the object level. You can configure privileges for object types at the user group level or configure privileges for specific objects in object-level privileges. Privileges add additional or custom security for an object. Privileges define which users and groups can read, update, delete, execute, and change privileges on the object.
Administrators assign privileges for the organization in Administrator. For more information, see the
Administrator help.
To perform actions in Cloud Integration Hub, Cloud Integration Hub users need the following privileges:
Administrator service
Read privileges for Organization, Secure Agent, Secure Agent Group, and User assets.
Read privileges for Connection, Mapping Task, and Synchronization Task assets.
You can assign privileges for Cloud Integration Hub assets by assigning user roles to users and user
groups. You can either use the Informatica Intelligent Cloud Services system-defined roles Designer,
Admin, or Monitor, or define custom roles. For more information about user roles in Cloud
Integration Hub, see “User roles” on page 33.
To perform actions in Informatica Intelligent Cloud Services for Cloud Integration Hub operations, for
example, to develop mappings and to create tasks, Informatica Intelligent Cloud Services users need the
following privileges:
Permissions
Permissions determine the access rights that a user has for a Secure Agent, Secure Agent group, connection,
schedule, or asset. Permissions add additional or custom security for an object. Permissions define which
users and groups can read, update, delete, execute, and change permissions on the object.
To configure permissions on an object, you need the following licenses and privileges:
• To configure permissions at the project level for all assets in a project, your organization must have the
Set/Unset Security Permissions at Project Level license.
• To configure permissions at the folder level for all assets in a folder, your organization must have the Set/
Unset Security Permissions at Folder Level license.
• To configure permissions on individual assets, your organization must have the Fine Grained Security
license.
• The role assigned to your user account or to a group in which you are a member must have the Set
Permission privilege for the object type. For example, to configure permissions on a Secure Agent, you
must be assigned a role that has the Set Permission privilege for Secure Agents.
To configure permissions on an object, navigate to the object and set the appropriate permissions. For
example, you want only users in the Development Team user group to have access to assets in the
Development Data folder. Navigate to the folder, edit the permissions, and grant the Development Team user
group permissions on the folder.
Permissions apply to the objects for which you configure them but not to copies of the object. Therefore,
when you copy or export an asset, the permissions are not copied or exported with the asset. For example,
you export a mapping task in which only user rjones has execute permission. When you import the mapping
task, the imported mapping has no permissions assigned to it. Therefore, any user with privileges to run
mapping tasks can run the imported task.
You can configure the following permissions on an object:
Permission Description
Note: These permissions control permissions within Informatica Intelligent Cloud Services. They do not
control operating system permissions, such as the ability to start, stop, or configure the Secure Agent on
Windows or Linux.
• When you configure permissions on an object, verify that the user or group to which you grant
permissions is assigned a role with the appropriate privileges for the object type. For example, if you grant
a user with the Service Consumer role Update privilege on a particular folder, the user cannot update the
folder because the Service Consumer role does not have update privileges for folders.
• To edit an asset, the user must have read permission on all assets used within the asset. For example,
when you assign a user Read and Update permissions on a synchronization task, verify that the user also
has Read permission on the connections, mapplets, schedules, and saved queries that are used in the
task.
• When a user edits a task, assets without Read permission are not displayed. To avoid unexpected results,
the user should cancel all changes and avoid editing the task until the user is granted the appropriate
Read permissions.
• When configuring a taskflow, a user needs Execute permission on all tasks to be added to the taskflow.
• To edit a taskflow, a user needs Execute permission on all tasks in the taskflow. Without Execute
permission on all tasks, the user cannot save changes to the taskflow.
• To run a taskflow, a user needs Read and Execute permissions on taskflows.
• To monitor jobs or to stop a running job, a user needs Execute permission on the mapping, task, or
taskflow.
Configuring permissions
You can configure permissions on an object if you are assigned a role with the Set Permission privilege for
the object type. For example, to configure permissions on a folder, you must be assigned a role that has the
Set Permission privilege for folders.
3. To configure user permissions on the object:
a. Select Users.
b. If the user does not appear in the Users list, click Add, and select a user.
c. Enable or disable the appropriate permissions on the user.
Note: When you grant any user permissions on the object, Informatica Intelligent Cloud Services also
adds you as a user with permissions on the object. This prevents you from losing access to the object
when you configure permissions.
4. To configure user group permissions on the object:
a. Select Groups.
b. If the group does not appear in the Groups list, click Add, and select a group.
c. Enable or disable the appropriate permissions on the group.
Note: When you grant any group permissions on the object, Informatica Intelligent Cloud Services also
adds you as a user with permissions on the object. This prevents you from losing access to the object
when you configure permissions.
5. To remove all permissions restrictions for the object, remove all users and groups from the Permissions
dialog box.
When you remove all users and groups, any user with appropriate privileges for the object type can
access the object.
6. Click Save.
Asset migration
You can migrate Cloud Integration Hub assets from one organization to another organization. Assets include
applications, topics, publications, subscriptions, and monitoring rules.
The process to migrate assets depends on whether or not the source and target organizations reside on the
same PoD (Point of Delivery):
• To migrate assets between organizations that reside on different PoDs, you export the assets from the
source organization and then import the assets into the target organization. For more information, see
“Exporting assets” on page 39 and “Importing assets” on page 39.
• To migrate assets between organizations that reside on the same PoD, you run the org to org migration
process. For more information, see “Migrating assets between organizations” on page 40.
Before you start the migration process, note the following considerations:
• When you migrate publications and subscriptions that publish and consume data with an API, Cloud
Integration Hub changes the API URL based on the URL of the target organization. Be sure to inform API
users of the new URL. After the migration is complete, you can copy the new URL from the publication or
subscription page.
• You cannot migrate a publication or subscription with the same name as a publication or subscription that
you used previously and later renamed or deleted.
Cloud Integration Hub does not export, import, or migrate assets that users created in other Informatica
Intelligent Cloud Services services and that the users later associated with Cloud Integration Hub assets. For
example, Cloud Integration Hub does not export, import, or migrate Data Integration mappings or tasks. For
more information about asset dependencies, see “Asset dependencies” on page 41.
Exporting assets
Export Cloud Integration Hub assets from the organization to an export file. You can select a single asset or
multiple assets to export, or you can export all assets in the organization. You can then import the assets to
another organization.
1. Click the Migration link in the upper-right corner of the Home page.
2. In the Export tab, click Select Entities.
The Select Entities page appears.
3. From the Entity Type list, select the types of assets to export. You can select All to export all asset
types.
Assets of the type or types that you select show in the Available Entities list.
4. In the Available Entities list select the assets to export and then click Add. To select all assets, click Add
All.
The assets to export show in the Selected Entities list.
5. In the Select Entities page, click OK.
The assets to export show in the Export tab. If there are conflicts, a conflict resolution shows next to the
relevant asset.
Note: You cannot remove dependent Cloud Integration Hub assets from the export list without removing
the parent asset.
6. Click Export.
7. In the Save As dialog box, define the location and name of the file to export the assets to, and then click
Save.
Cloud Integration Hub exports the assets and their dependent Cloud Integration Hub assets to the export
file.
Importing assets
Import Cloud Integration Hub assets to the organization from a Cloud Integration Hub export file.
1. Click the Migration link in the upper-right corner of the Home page and then select the Import tab.
2. In the Conflict Resolution Rules area, choose the actions to take when assets that you select to import
exist in the organization. Select one of the following resolutions for each asset type:
• Overwrite. Overwrite the asset with the imported asset. Overwritten assets cannot be recovered.
• Reuse. Do not import the asset and keep the existing asset.
• Cancel. Cancel the import operation.
3. In the Select Entities page, click OK.
The assets to import show in the Import tab. If there are conflicts, a conflict resolution shows next to the
relevant asset.
4. Click Import, select the export file in the Open dialog box, and then click Open.
Cloud Integration Hub imports the selected assets and their dependent Cloud Integration Hub assets to
the organization. If a selected asset exists in the organization, the action that Cloud Integration Hub
takes depends on the conflict resolution that you defined for the asset type. Import results and conflicts
appear in the Import tab.
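The conflict resolution rules can be pictured as a small merge routine. The following Python sketch is purely illustrative: the resolve_imports function, the asset dictionaries, and the lowercase rule strings are assumptions made for the example, not part of the Cloud Integration Hub API.

```python
# Illustrative sketch of per-asset-type conflict resolution during import.
# Rules map an asset type to "overwrite", "reuse", or "cancel" (hypothetical
# lowercase encodings of the documented Overwrite/Reuse/Cancel rules).

class ImportCancelled(Exception):
    """Raised when a conflict is resolved with the Cancel rule."""

def resolve_imports(existing, incoming, rules):
    """Return the resulting asset set after applying conflict rules.

    existing and incoming map (asset_type, name) -> asset definition.
    """
    result = dict(existing)
    for key, asset in incoming.items():
        if key in existing:
            rule = rules.get(key[0], "reuse")
            if rule == "cancel":
                raise ImportCancelled(f"conflict on {key}")
            if rule == "reuse":
                continue               # keep the existing asset
        result[key] = asset            # new asset, or overwrite
    return result

existing = {("topic", "Accounts"): {"tables": 1}}
incoming = {("topic", "Accounts"): {"tables": 2},
            ("publication", "SalesPub"): {"topic": "Accounts"}}

merged = resolve_imports(existing, incoming, {"topic": "overwrite"})
print(merged[("topic", "Accounts")])   # the imported definition wins
```

In practice, the Cancel rule aborts the entire import operation, which the sketch models by raising an exception.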
Migrating assets between organizations
Run the org to org migration process to migrate assets between organizations that reside on the same PoD.
Before you begin the migration process, verify that the following conditions exist:
• You have Informatica Intelligent Cloud Services login credentials for the source organization.
• The source organization is provisioned to Cloud Integration Hub.
1. Click the Migration link in the upper right corner of the Home page and then select the Org to Org Asset
Migration tab.
2. In the Source Organization area, click Log in, and then log in to the Informatica Intelligent Cloud Services
organization that contains the assets to migrate.
3. In the Conflict Resolution Rules area, choose the actions to take when assets that you select to migrate
exist in the target organization. Select one of the following resolutions for each asset type:
• Overwrite. Overwrite the target asset with the source asset. Overwritten assets cannot be recovered.
• Reuse. Do not migrate the source asset and keep the existing target object.
• Cancel. Cancel the entire migration operation.
4. In the Entities to Migrate area, click Select.
The Select Entities page appears.
5. From the Entity Type list select the types of assets to migrate, or select All to migrate all asset types.
Assets of the selected types show in the Available Entities list.
6. In the Available Entities list select the assets to migrate and then click Add. To select all assets, click
Add All.
The assets to migrate show in the Selected Entities list.
7. In the Select Entities page, click OK.
The assets to migrate show in the Org to Org Asset Migration tab.
8. Click Migrate.
Cloud Integration Hub migrates the selected assets and their dependent Cloud Integration Hub assets to
the target organization. If a selected asset exists in the target organization, the action that Cloud
Integration Hub takes depends on the conflict resolution that you defined for the asset type. Migration
conflicts and results appear in the Org to Org Asset Migration tab.
Asset dependencies
You can view object dependencies for an asset. You might want to view object dependencies before
performing certain operations on an asset.
For example, you cannot delete an asset if another object depends on the asset. You must first delete the
dependent objects and then delete the asset. You can find the dependent objects by viewing the asset
dependencies.
You can view object dependencies for Cloud Integration Hub assets from the topic or application pages and
from the relationship diagram on the Hub Overview page. To view object dependencies, click an asset. The
topic page, application page, or relationship diagram opens, showing the object dependencies.
The Uses tab lists the objects that the selected asset uses.
The Used By tab lists the objects that use the selected asset.
To drill down to the lowest level dependency, you can continue to show dependencies for each asset that
appears on the Dependencies page. At the top of the Dependencies page, a breadcrumb shows the chain of
dependencies.
The following image shows that the asset mt_FilterArchCustRecords is dependent on m_FilterCustRecords,
which is dependent on FF_USW1PF:
To view or delete an asset, in the row that contains the asset, click Actions and select the action.
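Because an asset cannot be deleted while other objects depend on it, dependent objects must be deleted first, walking the Used By chain from the top. The following Python sketch is an illustration, not a Cloud Integration Hub API: it derives a safe deletion order from a hypothetical used_by mapping, using the dependency chain described above as sample data.

```python
# Illustrative sketch: compute a deletion order in which every asset is
# deleted only after all assets that use it have been deleted.

def deletion_order(used_by):
    """used_by maps an asset to the set of assets that use it ("Used By").

    Returns a list in which dependents always precede the assets they use.
    """
    order, visited = [], set()

    def visit(asset):
        if asset in visited:
            return
        visited.add(asset)
        for dependent in used_by.get(asset, ()):
            visit(dependent)           # dependents must go first
        order.append(asset)

    for asset in used_by:
        visit(asset)
    return order

# Sample data based on the dependency chain shown above.
used_by = {
    "FF_USW1PF": {"m_FilterCustRecords"},
    "m_FilterCustRecords": {"mt_FilterArchCustRecords"},
    "mt_FilterArchCustRecords": set(),
}
print(deletion_order(used_by))
# ['mt_FilterArchCustRecords', 'm_FilterCustRecords', 'FF_USW1PF']
```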
Tags
A tag is an asset property that you can use to group assets. Create tags to filter for assets that share a
common attribute on the Explore page.
For example, your regional offices manage the assets that apply only to their region. Each of your
organization's assets includes a tag that identifies the asset by region. You want to view all of the assets that
the Southwest regional office manages. On the Explore page, you explore by tag and then click the SW
Region tag, as shown in the following image:
You can assign tags to all asset types. An asset can have up to 64 tags.
You can find all of the assets that have a particular tag using one of the following methods:
• Click the name of the tag in the Tags column, in any row.
• Explore by tag, and then in the list of tags that shows on the page, click the name of the tag.
The following image shows an Explore page that lists all the tags created for the organization:
Click the name of a tag to see a list of all the assets associated with the tag.
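Grouping assets by tag amounts to a simple lookup. The following Python sketch is illustrative only; the assets dictionary and helper functions are hypothetical, but the per-asset limit matches the 64-tag limit described above.

```python
# Illustrative sketch: group assets by tag and enforce the 64-tag limit.
MAX_TAGS_PER_ASSET = 64

def assets_with_tag(assets, tag):
    """assets maps an asset name to its set of tags."""
    return sorted(name for name, tags in assets.items() if tag in tags)

def add_tag(assets, name, tag):
    """Assign a tag to an asset, respecting the per-asset tag limit."""
    tags = assets.setdefault(name, set())
    if len(tags) >= MAX_TAGS_PER_ASSET and tag not in tags:
        raise ValueError(f"{name} already has {MAX_TAGS_PER_ASSET} tags")
    tags.add(tag)

assets = {"SalesPub": {"SW Region"}, "InvSub": {"NE Region"}}
add_tag(assets, "OpsTopic", "SW Region")
print(assets_with_tag(assets, "SW Region"))   # ['OpsTopic', 'SalesPub']
```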
Creating tags
You can create multiple tags to assign to assets.
You can create tags that you want to use for an asset when you configure the asset properties, or you can
create multiple tags to be available for future use. To create multiple tags for future use, you use an asset's
Properties dialog box.
Follow this procedure if you want to create multiple tags without assigning them to an asset.
5. After you have entered the tags, delete the tags from the Tags field so that the asset does not become
associated with the tags. The tags will still appear in the list of available tags.
6. Click Save.
Assigning tags
You can assign a tag to one asset at a time or assign a tag to multiple assets at the same time. You can
assign multiple tags to one asset.
When you assign tags to an asset, you can choose an existing tag or create a new one.
• To assign tags to multiple assets at the same time, in the row for each asset, select the check box.
After you have selected all of the assets, from the Selection menu, select Tags.
Edit a tag name or description in the tag properties. When you edit a tag, the properties for associated assets
update as well. For example, if your m_sales asset has the NorthWest tag and you change the name of the
tag to NW, the name of the tag changes to NW in the m_sales asset properties.
If you delete a tag, the tag no longer appears in the asset properties.
Applications
An application represents an entity in your organization that needs to share data with other applications in
your organization, such as sales applications or customer service applications. In Cloud Integration Hub, an
application is a container for publications and subscriptions.
An application can publish data to a defined topic and can subscribe to data from a topic. For example, a
sales application can publish sales reports and subscribe to inventory updates from an operations
application. When you add a publication to an application, you define the schedule according to which topic
data will be published from the application. You also define the schedule according to which topic data will
be retrieved from the application and published to the Cloud Integration Hub publication repository. When you
add a subscription to an application, you define the topic to which the application subscribes and the
schedule and scope of data that the application consumes from the topic. The topic defines the structure of
the data that the associated publications and subscriptions publish and consume.
Application management
Create applications and add a publication or a subscription to an application.
Creating an Application
Use the Navigator to create applications.
1. In the Navigator, click Explore. Click the All Assets list and then select Hub Management > Applications.
The Explore page shows all existing applications. You can sort the display by name, description, or last
modified.
2. Rest on the application, click the Actions menu at the right end of the line, and then, from the menu,
select Add Publication or Add Subscription.
The New Publication or New Subscription page shows. Define and save the publication or subscription.
Application properties
Application properties include general information about the application, a list of the publications that are
associated with the application, and a list of the subscriptions that are associated with the application.
Application Name
Name of the application. The name can contain up to 60 characters, including special characters.
Description
Chapter 5
Topics
A topic is an entity that represents a data domain that is published and consumed in Cloud Integration Hub. A
topic defines the data structure and additional data definitions, such as the data retention period. Multiple
applications can publish to the same topic. An application can subscribe to multiple topics.
For example: Create an Accounts topic into which two CRM applications, a current application and a legacy
application, publish accounts data. The marketing application and the data warehouse subscribe to the data
in the Accounts topic.
Topic structure
When you create the structure of a topic, you define the data structure on the publication repository to where
the publications that are associated with the topic publish data, and from where subscribers to the topic
consume the data. The topic structure can consist of multiple tables.
When you create a topic, Cloud Integration Hub generates the tables in the publication repository where it
retains the data that is published for the topic. Cloud Integration Hub uses the data structure for the
publications and subscriptions that are associated with the topic.
• Create a table from a connection. Use this method when the structure of a table in the data domain that
the topic represents exists in a connection object. You can use relational, flat file, and Salesforce
connections to create topic tables.
• Create a table from a flat file. Use this method when the structure of a table in the data domain that the
topic represents exists in a flat file.
• Create a table from a metadata file. Use this method when the structure of a table in the data domain that
the topic represents exists in a JSON, XML, XLS, or XLSX file. For more information, see “Using metadata
files to create topic tables” on page 48.
• Create a new table. Use this method to define the structure manually if the structure of the table does not
exist in a compatible file.
You can use more than one method to create tables in a single topic. For example, create two tables from a
flat file, create three tables from a metadata file, and create a new table.
Note: If you add a table or table column to a topic with associated publications or subscriptions, edit the mapping to include the additional table or column so that the additional data is published and consumed. If you do not update the mapping, Cloud Integration Hub will not publish the additional data to the publication repository and subscribers will not receive it.
Using metadata files to create topic tables
When you use a metadata file to create a topic table, you can define table attributes in the file before you
load it to Cloud Integration Hub. For example, define column data type and precision, or define a column as a
filter accelerator that is not encrypted.
You can use JSON, XML, XLS, and XLSX metadata files to create topic tables.
The metadata file must contain the following fields, and must not contain any other fields:
columnName
Mandatory. Name of the table column. The name must begin with an alphabetic character or underscore
and can contain only alphanumeric characters or underscores.
filterAccelerator
Optional. Indicates that the column will be used in subscription queries and requires performance-
related handling by Cloud Integration Hub. Use this indicator with topics that you plan to use for unbound
subscriptions. By default, false.
• Filter accelerators slow down the writing of publication data to the Cloud Integration Hub publication
repository.
• Filter accelerators have no impact on subscriptions that do not use filters.
• On a hosted Cloud Integration Hub publication repository, by default, Cloud Integration Hub encrypts
the topic data. To use a column as a filter accelerator, you must set the value of the column's
encryption field to false.
datatype
Optional. Data type of the column. By default, string. You can use one of the following data types:
• string
• decimal
• double
• int32
• int64
• date_time
• text
precision
Optional. Applies to data types that support precision. The default precision value depends on the data
type of the field:
• String: 255
• Decimal: 15
• Text: 50000
scale
Optional. Applies to data types that support data scaling. The default scale value depends on the data
type of the field:
• Decimal: 0
• All other data types: empty
encryption
Optional. Determines whether or not Cloud Integration Hub encrypts the column data. On a hosted publication repository, Cloud Integration Hub encrypts all columns by default.
If a file doesn't contain all the required fields, or contains non-required fields, loading the file to Cloud
Integration Hub fails.
If a file contains identical rows, Cloud Integration Hub adds only the first row to the topic table.
Example table in an XLS or XLSX file
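A metadata file with the same fields can also be written in JSON. The following Python sketch is illustrative: the validate function and its defaults are assumptions modeled on the field descriptions above (mandatory columnName, the allowed field set, and the default precision per data type), not Cloud Integration Hub's actual loader.

```python
import json
import re

# Illustrative metadata for two topic-table columns, expressed as JSON.
# Only the six documented fields are allowed; columnName is mandatory.
metadata = json.loads("""
[
  {"columnName": "CUSTOMER_ID", "datatype": "int64",
   "filterAccelerator": true, "encryption": false},
  {"columnName": "CUSTOMER_NAME", "datatype": "string"}
]
""")

ALLOWED = {"columnName", "filterAccelerator", "datatype",
           "precision", "scale", "encryption"}
NAME_RE = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")
DEFAULT_PRECISION = {"string": 255, "decimal": 15, "text": 50000}

def validate(columns):
    """Apply the documented defaults; raise ValueError on invalid input."""
    for col in columns:
        if "columnName" not in col or set(col) - ALLOWED:
            raise ValueError(f"invalid fields in {col}")
        if not NAME_RE.match(col["columnName"]):
            raise ValueError(f"invalid column name {col['columnName']!r}")
        col.setdefault("datatype", "string")
        col.setdefault("filterAccelerator", False)
        if col["datatype"] in DEFAULT_PRECISION:
            col.setdefault("precision", DEFAULT_PRECISION[col["datatype"]])
    return columns

validate(metadata)
print(metadata[1]["precision"])   # 255, the default string precision
```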
Topic structure updates
When you edit the structure of a topic with associated publications or subscriptions, it might affect the
associated publications and subscriptions. Topic structure changes might also impact the data in the
publication repository and sometimes cause data loss.
Based on the nature of the update, you might have to edit the associated publications and subscriptions to
align with the updated topic structure. The following table describes the effects of topic structure updates on
data in the publication repository and the resulting optional or required changes to the associated
publications and subscriptions.
Add table
Effect on data in the publication repository: Table added.
Changes to associated publications and subscriptions: Optional: To publish and to consume the additional table, edit the mapping to include the additional table. If you do not update the mapping, data in the table will not be published to the publication repository and subscribers will not receive it.

Delete table
Effect on data in the publication repository: Table deleted, including data that was published to the table.
Changes to associated publications and subscriptions: Remove references to the table from the mapping of publications and from the mapping and the filter of subscriptions.

Add column
Effect on data in the publication repository: Column added.
Changes to associated publications and subscriptions: Optional: To publish and to consume the additional column, edit the mapping to include the additional column. If you do not update the mapping, data in the column will not be published to the publication repository and subscribers will not receive it.

Delete column
Effect on data in the publication repository: Column deleted, including data that was published to the column.
Changes to associated publications and subscriptions: Remove references to the column from the mapping of publications and from the mapping and the filter of subscriptions.

Rename column
Effect on data in the publication repository: Column deleted, including data that was published to the column, and another column created with the new name.
Changes to associated publications and subscriptions: Remove references to the changed column from the mapping of publications and from the mapping and the filter of subscriptions. Optional: To publish or to consume the column that is created with the new name, edit the mapping to include the new column. If you do not update the mapping, data in the new column will not be published to the publication repository and subscribers will not receive it.

Change column data type
Effect on data in the publication repository: Column deleted, including data that was published to the column, and another column created with the new data type.
Changes to associated publications and subscriptions: Remove references to the changed column from the mapping of publications and from the mapping and the filter of subscriptions. Optional: To publish or to consume the column that is created with the new data type, edit the mapping to include the new column. If you do not update the mapping, data in the new column will not be published to the publication repository and subscribers will not receive it.

Increase column precision, scale unchanged
Effect on data in the publication repository: Column updated.
Changes to associated publications and subscriptions: Open the publication or the subscription page for all associated publications and subscriptions. You do not need to edit any of the publication or subscription settings.

Increase column precision, increase scale by a value lower than or equal to the precision increase
Effect on data in the publication repository: Column updated.
Changes to associated publications and subscriptions: Open the publication or the subscription page for all associated publications and subscriptions. You do not need to edit any of the publication or subscription settings.

Any other precision or scale updates
Effect on data in the publication repository: Column deleted, including data that was published to the column, and another column created with the updated precision or scale.
Changes to associated publications and subscriptions: Remove references to the changed column from the mapping of publications and from the mapping and the filter of subscriptions. Optional: To publish or to consume the column that is created with the new precision or the new scale, edit the mapping to include the new column. If you do not update the mapping, data in the new column will not be published to the publication repository and subscribers will not receive it.

Note: Deleting columns in the publication repository might take a long time, based on the number of rows in the table.
Data retention
The retention period for consumed data defines how long Cloud Integration Hub retains consumed data in the publication repository after all the subscribers consume it. The retention period for consumed data is between 1 and 90 days. For each publication instance, the retention period for consumed data starts after all the subscribers have either successfully consumed or discarded the data, that is, after all the events that are associated with the publication instance are in a Complete or in a Discarded event status. After the retention period for consumed data expires, Cloud Integration Hub deletes the consumed data from the publication repository.
The retention period for unconsumed data defines how long Cloud Integration Hub retains unconsumed data in the publication repository before it deletes the data. The retention period for unconsumed data is between the retention period for consumed data and 90 days.
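The two retention periods can be sketched as a date calculation. The deletion_dates function below is hypothetical and only illustrates the constraints described above: the consumed-data period runs from the date all subscribers consumed or discarded the instance, the unconsumed-data period runs from the publication date, and the documented ranges are enforced.

```python
from datetime import date, timedelta

def deletion_dates(published_on, all_consumed_on,
                   consumed_days, unconsumed_days):
    """Return (consumed_delete_date, unconsumed_delete_date).

    all_consumed_on is the date on which all subscribers consumed or
    discarded the publication instance, or None if some have not yet.
    """
    if not 1 <= consumed_days <= 90:
        raise ValueError("consumed retention must be between 1 and 90 days")
    if not consumed_days <= unconsumed_days <= 90:
        raise ValueError("unconsumed retention must be between the "
                         "consumed retention and 90 days")
    consumed = (all_consumed_on + timedelta(days=consumed_days)
                if all_consumed_on else None)
    unconsumed = published_on + timedelta(days=unconsumed_days)
    return consumed, unconsumed

consumed, unconsumed = deletion_dates(date(2021, 7, 1), date(2021, 7, 3),
                                      consumed_days=7, unconsumed_days=30)
print(consumed, unconsumed)   # 2021-07-10 2021-07-31
```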
Topic management
Create topics, add publications and subscriptions to topics, and subscribe to topics.
4. Choose whether to prevent new publications and new subscriptions to the topic. If you choose this option, you cannot create publications and subscriptions that publish to and subscribe from the topic.
5. Enter the number of days to retain consumed data in the publication repository in the Retention period
for consumed data field. Enter a value between 1 and 90 days. For each publication instance, the
retention period for consumed data starts if all the subscribers have either successfully consumed or
discarded the data. That is, after all the events that are associated with the publication instance are
either in a Complete or in a Discarded event status.
6. Enter the number of days to retain unconsumed data in the publication repository in the Retention period
for unconsumed data field. Enter a value between the retention period for consumed data and 90 days.
7. Click Create Table From and select one of the following methods:
• Create a table from a connection. Use this method when the structure of a table in the data domain
that the topic represents exists in a connection object. You can use relational, flat file, and Salesforce
connections to create topic tables.
• Create a table from a flat file. Use this method when the structure of a table in the data domain that
the topic represents exists in a flat file.
• Create a table from a metadata file. Use this method when the structure of a table in the data domain
that the topic represents exists in a JSON, XML, XLS, or XLSX file. For more information, see “Using
metadata files to create topic tables” on page 48.
• Create a new table. Use this method to define the structure manually if the structure of the table does
not exist in a compatible file.
8. Define the table in the create table dialog box and then click OK.
The structure of the table shows in the Topic Structure area.
9. Add the number of tables that you require to the topic. You must add at least one table to the topic. You
can use multiple methods to add tables to the topic.
To edit or to delete a topic table, rest on a row in the table and click the Action menu at the right end of
the line. From the menu select the required action: add row, rename table, delete row, or delete table.
10. Click Save.
The topic page shows the Topic Diagram.
11. Optionally, add publications and subscriptions to the topic. Perform one or both of the following actions:
• To add a publication to the topic, expand the Publications area and click New Publication. For more information, see "Creating a publication".
• To add a subscription to the topic, expand the Subscriptions area and click New Subscription. For more information, see "Creating a subscription".
Subscribing to a topic
Use the Explore page to subscribe to a topic.
1. On the Explore page, navigate to the object that you want to subscribe to a topic.
2. In the row that contains the object, click Actions. Select Subscribe and then configure the subscription.
Topic properties
Topic properties include general information about the topic, the topic structure, and the publications and
subscriptions that are associated with the topic.
• Topic Diagram. Provides a visual overview of the topic and its relations to other assets. You can perform
actions on assets in the diagram. For more information, see "Topic Diagram" on page 53.
• General Details. General information about the topic. For more information, see “General Details
properties” on page 54.
• Topic Structure. List of topic tables, including details about each table. You add topic tables to the topic in
this area. For more information, see “Topic Structure properties” on page 55.
• Publications. List of publications that publish data to the topic, including information about each
publication. You can perform actions on existing publications and create new publications in this area. For
more information, see “Publications properties” on page 58.
• Subscriptions. List of subscriptions that subscribe to data from the topic, including information about
each subscription. You can perform actions on existing subscriptions and create new subscriptions in this
area. For more information, see “Subscriptions properties” on page 59.
You can collapse and expand each area on the topic page.
Topic Diagram
The topic page shows the Topic Diagram. The diagram provides a visual overview of the topic and its
relations to other assets, including the following assets:
When you click an asset, the properties page for the asset appears. For example, when you click a
publication, the publication page appears.
When you right-click an asset, you can open it for viewing and editing. You can also run publications and
subscriptions that trigger Data Integration tasks.
Topic Name
Name of the topic. The name must begin with an alphabetic character or underscore and can contain
only alphanumeric characters or underscores.
Description
Topic Type
Type of the topic. Topic type depends on the type of data that applications publish to the topic and has
an impact on the delivery options to the subscribers to the topic.
• Incremental Load. The topic instance contains only the latest data changes. If you choose this topic
type, verify that the data sources include delta indicators.
• Full Load. The topic contains all of the data changes that occurred after the last publication.
Prevent new publications from publishing to the topic and prevent new subscriptions from subscribing to the topic, for example, when you plan to delete the topic. The topic is not available for selection when you create publications and subscriptions.
Existing publications can publish data to the topic and existing subscriptions can consume data from
the topic.
Retention period for consumed data
Determines how long Cloud Integration Hub retains consumed data in the publication repository before it
deletes the data. The retention period for consumed data must be between 1 and 90 days.
For each publication instance, the retention period for consumed data starts if all the subscribers have
either successfully consumed or discarded the data. That is, after all the events that are associated with
the publication instance are either in a Complete or in a Discarded event status.
Retention period for unconsumed data
Determines how long Cloud Integration Hub retains unconsumed data in the publication repository
before it deletes the data. The retention period for unconsumed data must be between the retention
period for consumed data and 90 days.
For each publication instance, the retention period for unconsumed data starts after the data is
published.
Topic Structure properties
Add tables to the topic. The topic must contain at least one table.
You can use one or more of the following methods to add tables to the topic:
• Create a table from a connection. Use this method when the structure of a table in the data domain
that the topic represents exists in a connection object. You can use relational, flat file, and Salesforce
connections to create topic tables.
• Create a table from a flat file. Use this method when the structure of a table in the data domain that
the topic represents exists in a flat file.
• Create a table from a metadata file. Use this method when the structure of a table in the data domain
that the topic represents exists in a JSON, XML, XLS, or XLSX file. For more information, see “Using
metadata files to create topic tables” on page 48.
• Create a new table. Use this method to define the structure manually if the structure of the table does
not exist in a compatible file.
Show
Lists the tables in the topic. You can select a specific table to show.
The list of topic tables shows the following properties for each table:
Table
Name of the topic table. A topic table name must begin with an alphabetic character or underscore
and can contain only ASCII alphanumeric characters or underscores. The name must be unique in
the Cloud Integration Hub repository.
Column
Name of the table column. The name must begin with an alphabetic character or underscore and
can contain only alphanumeric characters or underscores.
Filter Accelerator
Indicates that the column will be used in subscription queries and requires performance-related
handling by Cloud Integration Hub. Use this indicator with topics that you plan to use for unbound
subscriptions.
• Filter accelerators slow down the writing of publication data to the Cloud Integration Hub
publication repository.
• Filter accelerators have no impact on subscriptions that do not use filters.
• In a hosted Cloud Integration Hub publication repository, by default, Cloud Integration Hub
encrypts the topic data. To use a column as a filter accelerator you must change the value of
Encrypted to No for the column.
Data Type
Select from the list of available data types. By default, Cloud Integration Hub reads the data as
string.
Precision
Enabled only for data types that support precision. For a String data type, the maximum precision
that Cloud Integration Hub supports is 1900 characters.
Scale
Encrypted
Determines whether or not Cloud Integration Hub encrypts the column data. On a hosted publication
repository, Cloud Integration Hub encrypts all columns by default. You can turn off the encryption
for specific columns, for example, for columns you plan to use as filters in your mappings.
Add Table from a Connection properties
Add a topic table from a connection object that contains the structure of a table in the data domain that the topic represents.
The Add Table from a Connection page includes the following properties:
Connection
Connection that contains the object to create the topic table from.
Source Object
Formatting Options
Applies to flat file connections. Defines the delimiter, text qualifier, and escape character that are used in
the file.
Table Name
Name of the topic table. The name must begin with an alphabetic character or underscore and can
contain only ASCII alphanumeric characters or underscores. The name must be unique in the Cloud
Integration Hub repository.
Add Table from Flat File properties
Add a topic table from a flat file that contains the structure of a table in the data domain that the topic
represents.
The Add Table from Flat File page includes the following properties:
File
Name of the file that contains the structure of the data domain that the topic represents.
Drop a file into the File field or click Choose File to browse to and choose the sample file on which to
base the table structure.
Table Name
Name of the topic table. The name must begin with an alphabetic character or underscore and can
contain only ASCII alphanumeric characters or underscores. The name must be unique in the Cloud
Integration Hub repository.
Optional. Select this option to use the column names in the file as the default column headers in the
table. Enter the number of the line that serves as the file's header line in the From Line field.
Code page
Delimiter
Delimiter used in the file to separate between columns. Select a predefined delimiter or select Custom to
define a custom delimiter.
Text qualifier
Load File
Preview
Shows the columns that will be added to the table after you load the file.
The Add Table from Metadata File page includes the following properties:
File
Name of the file that contains the structure of the data domain that the topic represents.
Drop a file into the File field or click Choose File to browse to and choose the sample file on which to
base the table structure.
Table Name
Name of the topic table. The name must begin with an alphabetic character or underscore and can
contain only ASCII alphanumeric characters or underscores. The name must be unique in the Cloud
Integration Hub repository.
Load File
Loads the selected file and shows the status of the file, valid or invalid. If a file is valid and Cloud
Integration Hub converts source values to Cloud Integration Hub default values, the changes are listed in
the Create Table from Metadata File page. For more information, see “Using metadata files to create
topic tables” on page 48.
Table Name
Name of the table. The name must begin with an alphabetic character or underscore and can contain
only ASCII alphanumeric characters or underscores. The name must be unique in the Cloud Integration
Hub repository.
Number of columns
Publications properties
The Publications area of the topic page includes the following properties:
New Publication
Create a publication that publishes data to the topic. For more information about creating publications,
see Creating a publication.
Publication list
List of publications that publish data to the topic. When you right-click a publication, an actions menu
opens. From the menu you can run, view, disable or enable, and delete the publication.
The publication list shows the following properties for each publication:
Name
Description
Mode
Publication mode, enabled or disabled. A disabled publication does not run according to schedule or
by an external API. You can only run a disabled publication from the Explore page or from the topic
page of the topic that the publication publishes to.
Last Modified
Subscriptions properties
The Subscriptions area of the topic page includes the following properties:
New Subscription
Create a subscription that consumes data from the topic. For more information about creating
subscriptions, see Creating a subscription.
Subscription list
List of subscriptions that consume data from the topic. When you right-click a subscription, an actions
menu opens. From the menu you can run, view, disable or enable, and delete the subscription. You can
also get data that was published before the subscription subscribed to the topic and therefore was not
consumed by the subscriber.
The subscription list shows the following properties for each subscription:
Name
Description
Mode
Subscription mode, enabled or disabled. A disabled subscription does not run according to schedule
or by an external API. You can only run a disabled subscription from the Explore page or from the
topic page of the topic that the subscription subscribes to.
Last Modified
Chapter 6
To use a Data Integration task in a publication, you create a synchronization task or a mapping task in Data
Integration before you create the publication. You select the task when you create the publication.
To use a Data Integration task in a subscription, you can use one of the following methods:
• Create a synchronization task or a mapping task in Data Integration before you create the subscription,
and select the task when you create the subscription.
• Create a synchronization task when you create the subscription. Cloud Integration Hub saves the task in
Data Integration.
Note: Publications and subscriptions that publish and consume data with an API use the Cloud Integration
Hub REST APIs. For more information, see Chapter 10, “Cloud Integration Hub REST APIs” on page 95.
Use a synchronization task for a publication or a subscription where the publication or subscription process
requires mappings and filters that synchronization tasks support. For example, to read data from a CRM
application and publish the data as is.
Use a mapping task for a publication or a subscription if you want to use an advanced ETL (Extract,
Transform, and Load) process for the Cloud Integration Hub publication or subscription process. For
example, you can use a mapping task to perform the following actions on a publication or subscription:
General rules and guidelines
Consider the following rules and guidelines when you create Data Integration mappings and tasks:
• Do not run tasks that you create for Cloud Integration Hub from within Informatica Intelligent Cloud
Services. You must run the tasks from Cloud Integration Hub by running the publication or the
subscription to which the task is associated.
• When you use the Cloud Integration Hub connection, the target object in a publication mapping or task
and the source object in a subscription mapping or task present the list of topics defined in Cloud
Integration Hub. The format of the list is TopicName/tableName.
Warning: When you set up the organization in Cloud Integration Hub, Cloud Integration Hub creates the
connection Cloud Integration Hub in the organization in Informatica Intelligent Cloud Services. Do not
rename or edit this connection. Editing the connection or changing the connection name might result in
errors at run time.
• Cloud Integration Hub determines the scheduling of the publication or the subscription based on the
settings that the operator defined for the publication or the subscription. When you create the Data
Integration task, in the Schedule page of the task wizard, verify that the option Do not run this task on a
schedule is selected.
• To distinguish between publication tasks and subscription tasks, indicate the type of the task in the task
name. When you select a task for a publication or for a subscription, you can easily select an appropriate
task.
For example, name a publication task Pub_<TaskName>, and name a subscription task Sub_<TaskName>.
- Subscriptions: when you subscribe to multiple tables, or when the subscription is a compound
subscription.
• Cloud Integration Hub supports the following connection types in synchronization tasks that you create
for subscriptions in Cloud Integration Hub:
- Relational database
- Salesforce
- Flat file
• The mapping operation is an insert operation for both publication and subscription mappings.
• When you create a publication mapping, select the Cloud Integration Hub connection when you configure
the target properties. When you create a subscription mapping, select the Cloud Integration Hub
connection when you configure the source properties.
For publications and subscriptions that require additional data processing, use mapping tasks.
Description
Optionally, enter a description for the task. The description can contain up to 255 characters.
Task Operation
Choose Insert.
3. Click Next.
The Source page appears.
Select a source connection that connects to the source from which you want to publish data.
Source Type
The source type depends on the number of tables that you want to publish:
• To publish a single table, select Single.
• To publish multiple tables, select Multiple and then create a relationship between the tables.
Source Object
Target Object
Select the topic table to which you want to publish data. The format of the target object is
TopicName/tableName.
2. Click Next.
The Data Filters page appears.
3. Optionally, configure data filters. You configure data filters for Cloud Integration Hub publications in the
same way that you configure data filters for other Data Integration tasks.
4. Click Next.
The Field Mapping page appears.
1. Map fields in the Source column to fields in the Target column and then click Next.
The Schedule page appears.
2. Verify that the option Do not run this task on a schedule is selected. The task runs according to the
schedule of the publication that uses the task.
3. Select Save > Save and Close to save the task.
Tip: You can also create a synchronization task for a subscription in Cloud Integration Hub. For more
information, see “Creating a subscription that triggers a Data Integration task” on page 79.
Description
Optionally, enter a description for the task. The description can contain up to 255 characters.
Task Operation
Choose Insert.
The source type depends on the number of tables that you want to consume and on the subscription
type:
• To consume a single table, select Single.
• To consume multiple tables, or when the subscription is a compound subscription, select
Multiple and then create a relationship between the tables.
Source Object
Select the topic table from which you want to consume data. The format of the object is TopicName/
tableName.
Select a target connection that connects to the target into which you want to consume data.
Target Object
Select the target into which you want to consume the data.
2. Click Next.
The Data Filters page appears.
3. Optionally, configure data filters. You configure data filters for Cloud Integration Hub subscriptions in the
same way that you configure data filters for other Data Integration tasks.
4. Click Next.
The Field Mapping page appears.
1. Map fields in the Source column to fields in the Target column and then click Next.
The Schedule page appears.
2. Verify that the option Do not run this task on a schedule is selected. The task runs according to the
schedule of the subscription that uses the task.
3. Select Save > Save and Close to save the task.
For publications and subscriptions that require mapping and filtering only, use synchronization tasks.
To use a mapping task with the Cloud Integration Hub Connector, perform the following tasks:
In publication mappings and tasks the source is the cloud application from which to publish data and the
target is the topic table in the Cloud Integration Hub publication repository to which the publication publishes
data.
The topic must exist in Cloud Integration Hub before you create the mapping.
The mapping must exist in Mapping Designer before you create the task.
1. In Data Integration, click New > Task > Mapping Task > Create.
2. Specify the following task details:
Task Name
Description
Optionally, enter a description for the task. The description can contain up to 255 characters.
Runtime Environment
Runtime environment that contains the Secure Agent to run the task.
Mapping
In subscription mappings and tasks the source is the topic table in the Cloud Integration Hub publication
repository from where to consume data and the target is the cloud application that consumes the data.
The topic must exist in Cloud Integration Hub before you create the mapping and the task.
The mapping must exist in Mapping Designer before you configure the task.
Description
Optionally, enter a description for the task. The description can contain up to 255 characters.
Runtime Environment
Runtime environment that contains the Secure Agent to run the task.
Mapping
Publications
Publications are entities that define how applications publish data to Cloud Integration Hub, including the
type, format, and schedule of data publication. Publications publish data to topics. Multiple publications can
publish to the same topic. The topic defines the structure to which the data is published.
Publications can publish from any type of source that Informatica Intelligent Cloud Services supports.
Publication types
You can use the following publication types to publish data with Cloud Integration Hub:
When the publication runs, the Cloud Integration Hub server triggers the Data Integration task that is
defined for the publication and instructs the Informatica Intelligent Cloud Services data engine to
retrieve the data from the publishing application. The data engine runs the Data Integration task, and
transfers the source data to the topic on the Cloud Integration Hub publication repository.
The Cloud Integration Hub Publish Data API publishes to a specific topic on the Cloud Integration Hub
publication repository.
Use this type of publication to publish small transactions from within a workflow, for example, from
within Application Integration.
Publication processes
The publication process depends on the publication type.
repository. After the publication process ends, each subscriber consumes the published data according to
the schedule and the filter that you define when you create the subscription.
1. When the publication is triggered, either according to schedule or by an external API, the Cloud
Integration Hub server triggers the Data Integration task that is defined for the publication through an
Informatica Intelligent Cloud Services REST API.
2. The publication process uses the Cloud Integration Hub cloud connector to write the data to Cloud
Integration Hub.
3. The Cloud Integration Hub server changes the status of the publication event to complete and triggers
subscription processing.
Publication mapping
For publications that trigger a Data Integration task, mapping is the data mapping between the publishing
source and the Cloud Integration Hub publication repository.
A publication runs a Data Integration task that reads from the source and publishes to the topic tables. Task
targets must include at least one of the topic tables, and must not include any target table that is not defined
in the topic.
You create the task in Data Integration and then select it when you create the publication in Cloud Integration
Hub. Cloud Integration Hub uses an Informatica Intelligent Cloud Services REST API to trigger the task, and
the Cloud Integration Hub cloud connector writes the published data to Cloud Integration Hub.
Publication sources
Publications can publish from any type of source that Informatica Intelligent Cloud Services supports.
Publication schedules
For publications that trigger a Data Integration task, the publication schedule defines the frequency of the
publication. You can publish the data manually or by an external trigger, or publish the data at defined
intervals.
For file publications that are published manually, by an external trigger, or at defined intervals, and that
publish multiple files, all the files must be present in the source location when the publication starts.
Publication management
Create, disable, and enable publications, and run publications manually, including disabled publications.
• An application to publish the data from must exist. You can either use an existing application, or create
and save a new application.
• A topic to publish data to must exist. You can either use an existing topic, or create and save a new topic.
• A publication Data Integration task must exist.
Tip: You can also create publications on the topic page. For more information, see “Creating a topic” on page
52.
4. Select Publish with a Data Integration task.
5. Choose the application that publishes the data.
6. Choose the topic that the application publishes the data to.
7. Select the task that defines the publication mapping.
8. If the publication publishes large amounts of data, increase the write batch size to optimize the
performance of the publication.
Note: Increasing the batch size increases the memory consumption of the Secure Agent and might
impact the performance of the Secure Agent machine.
9. Select the method and the frequency of data publishing.
Manually or by an external trigger
No schedule. You can use the following methods to run the publication:
• Run manually from the Cloud Integration Hub explorer.
• Run by an API. Call a REST API that starts the publication.
For file publications that use this scheduling option and that publish multiple files, all the files must
be present in the source location when the publication starts.
By schedule
Runs the publication according to the defined schedule. Select one of the following options:
• Every n minutes. Runs the publication in intervals of up to 60 minutes. You select the number of
minutes from the list.
• Hourly. Runs the publication in intervals of up to 24 hours. You select the number of hours from
the list. The publication runs at the beginning of the hour. For example, if you enter 2, the
publication runs at 00:00, 02:00, and at consecutive two-hour intervals.
• Daily. Runs the publication at the same hour every day.
• Weekly. Runs the publication every week on one or more days at the same hour.
• Monthly. Runs the publication every month on a specific date or a specific day at the same hour.
Define the publication intervals in the Repeat running area.
For file publications that use this scheduling option and that publish multiple files, all the files must
be present in the source location when the publication starts.
10. Click Save.
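For the "Manually or by an external trigger" option above, a small client can start the publication run through the REST API. The following is a sketch only: the base URL, endpoint path, and session header are placeholders and assumptions, not the documented Cloud Integration Hub API. See Chapter 10, "Cloud Integration Hub REST APIs" for the actual endpoints.

```python
import json
import urllib.request

# Placeholder base URL; the real value depends on your organization.
BASE_URL = "https://cih.example.com/dih-console/api/v1"

def build_run_publication_request(base_url, publication_name, session_id):
    """Build (but do not send) an HTTP request that starts a publication run.

    The endpoint path and session header are illustrative assumptions.
    """
    url = f"{base_url}/publications/{publication_name}/run"  # hypothetical path
    req = urllib.request.Request(
        url, data=json.dumps({}).encode("utf-8"), method="POST"
    )
    req.add_header("Content-Type", "application/json")
    req.add_header("icSessionId", session_id)  # IICS-style session header (assumption)
    return req

req = build_run_publication_request(BASE_URL, "Pub_DailyOrders", "my-session-id")
# urllib.request.urlopen(req) would send the request and start the publication.
```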
• An application to publish the data from must exist. You can either use an existing application, or create
and save a new application.
• A topic to publish data to must exist. You can either use an existing topic, or create and save a new topic.
Tip: You can also create publications on the topic page. For more information, see “Creating a topic” on page
52.
2. Enter the publication name. Optionally, enter a description for the publication.
3. Choose the publication mode, enabled or disabled. A disabled publication does not run according to
schedule or by an external API. You can only run a disabled publication from the Explore page or from
the topic page of the topic that the publication publishes to.
4. Select Publish data with an API.
5. Choose the application that publishes the data.
6. Choose the topic that the application publishes the data to.
7. Click Save.
You can copy the following URLs and use them in the request that runs the publication:
• URL of the REST API. Use this URL to publish the data.
• URL of the Swagger structure for the topic that the publication publishes data to. Use the structure in
the publication request. If a Swagger structure base URL is configured for the organization, Cloud
Integration Hub appends the base URL to the topic Swagger structure URL. For more information, see
“System Properties” on page 30.
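A Publish Data API request built from the copied URLs might look like the following sketch. The payload shape is an assumption for illustration only: the real request body must follow the Swagger structure that Cloud Integration Hub generates for the topic.

```python
import json
import urllib.request

# Placeholder URL; copy the real REST API URL and the topic Swagger URL
# from the publication page.
PUBLISH_URL = "https://cih.example.com/dih-console/api/v1/publish"

def build_publish_payload(topic_table, rows):
    """Wrap rows for one topic table in an assumed publish-request shape."""
    return {"data": {topic_table: rows}}

payload = build_publish_payload(
    "Orders/orderLines",  # hypothetical TopicName/tableName
    [{"ORDER_ID": "1001", "SKU": "A-17", "QTY": 3}],
)
request = urllib.request.Request(
    PUBLISH_URL,
    data=json.dumps(payload).encode("utf-8"),
    method="POST",
)
request.add_header("Content-Type", "application/json")
# urllib.request.urlopen(request) would publish the transaction.
```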
Tip: You can also run publications manually on the topic page. For more information, see “Topic
properties” on page 53.
1. In the Navigator, click Explore. Click the All Assets list and then select Hub Management > Publications.
The Explore page shows all existing publications. You can sort the display by name, description, mode,
topic, or last modified.
2. Rest on the publication and click the Actions menu at the right end of the line. From the menu select
Run.
Tip: You can also disable and enable publications on the topic page. For more information, see “Publications
properties” on page 58.
1. In the Navigator, click Explore. Click the All Assets list and then select Hub Management > Publications.
The Explore page shows all existing publications. You can sort the display by name, description, mode,
topic, or last modified.
2. Rest on the publication and click the Actions menu at the right end of the line. From the menu, select
Disable or Enable, as required.
Publication properties
Publication properties include general information about the publication, the application and topic to use for
the publication, and, for publications that trigger a Data Integration task, the task to run and the publication
scheduling.
Publication Name
Name of the publication. The name can contain up to 60 characters and can contain special characters.
Description
Mode
Publication mode, enabled or disabled. A disabled publication does not run according to schedule or by
an external API. You can only run a disabled publication from the Explore page or from the topic page of
the topic that the publication publishes to.
Publication Method
• Publish with a Data Integration task. The publication process triggers a Data Integration task to
retrieve the data from the publishing application and write the data to the topic on the Cloud
Integration Hub publication repository.
• Publish data with an API. Use the Publish Data REST API to publish the data to a specific topic in the
Cloud Integration Hub publication repository. Select this option for high-frequency publications of
small transactions.
After you configure the publication properties, you can copy the following URLs from the publication
page:
• URL of the REST API. Use this URL to publish the data.
• URL of the Swagger structure for the topic that the publication publishes data to. Use the structure
in the publication request.
You use the URLs when you create the request that runs the publication.
Application
Topic
Task
Task that defines the publication mapping. Applies to publications that trigger a Data Integration task.
Batch Size
Number of records that the Cloud Integration Hub connector writes to the publication repository in a
single batch. Applies to publications that trigger a Data Integration task.
Note: If you configure the Cloud Integration Hub connection to use JDBC for a private publication
repository, the batch size doesn't apply.
Scheduling
Method and frequency of data publishing. Applies to publications that trigger a Data Integration task.
Manually or by an external trigger
No schedule. You can use the following methods to run the publication:
For file publications that use this scheduling option and that publish multiple files, all the files must
be present in the source location when the publication starts.
By schedule
Runs the publication according to the defined schedule. Select one of the following options:
• Every n minutes. Runs the publication in intervals of up to 60 minutes. You select the number of
minutes from the list.
• Hourly. Runs the publication in intervals of up to 24 hours. You select the number of hours from
the list. The publication runs at the beginning of the hour. For example, if you enter 2, the
publication runs at 00:00, 02:00, and at consecutive two-hour intervals.
• Daily. Runs the publication at the same hour every day.
• Weekly. Runs the publication every week on one or more days at the same hour.
• Monthly. Runs the publication every month on a specific date or a specific day at the same hour.
For file publications that use this scheduling option and that publish multiple files, all the files must
be present in the source location when the publication starts.
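The top-of-the-hour behavior of the Hourly option above can be expressed as simple arithmetic. This sketch is illustrative only and is not product code:

```python
def hourly_run_times(every_n_hours):
    """Daily run times for the Hourly option: the publication starts at the
    top of the hour, so an interval of 2 gives 00:00, 02:00, ..., 22:00."""
    if not 1 <= every_n_hours <= 24:
        raise ValueError("the hourly interval must be between 1 and 24 hours")
    return [f"{hour:02d}:00" for hour in range(0, 24, every_n_hours)]

runs = hourly_run_times(2)
# runs begins with "00:00", "02:00", "04:00"
```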
Chapter 8
Subscriptions
Subscriptions are entities that define how applications consume data from Cloud Integration Hub.
Subscriptions subscribe to topics. Multiple subscriptions can consume data from the same topic.
Subscriptions can consume data into any type of target that Informatica Intelligent Cloud Services supports.
Subscription types
You can use the following subscription types to consume data with Cloud Integration Hub:
Subscriptions that trigger a Data Integration task can subscribe to multiple topics. When the
subscription runs, the Cloud Integration Hub server triggers the Data Integration task that is defined for
the subscription and instructs the Informatica Intelligent Cloud Services data engine to retrieve the data
from the topic or topics on the Cloud Integration Hub publication repository. The data engine runs the
Data Integration task, and transfers the data to the subscribing application.
For subscriptions that trigger a Data Integration task you can define the delivery behavior for the
published data, for example, to aggregate all data sets to a single data set, or to consume the latest
published data set. You can also configure a retry policy that defines the number of times Cloud
Integration Hub retries to run the subscription in case of failure and the retry interval.
Use this method to consume batch data into files, applications, and repositories.
The Cloud Integration Hub Consume Data API consumes data from a specific topic on the Cloud
Integration Hub publication repository.
Use this type of subscription for high frequency, event-driven subscriptions. For example, to consume
data that is published with the Publish Data API.
Subscription processes
The subscription process depends on the subscription type.
Subscription process for subscriptions that trigger Data
Integration tasks
When a subscription triggers a Data Integration task, the subscription process includes retrieving the
required data from the Cloud Integration Hub publication repository, running the subscription mapping, and
writing the data to one or more subscriber targets. Cloud Integration Hub keeps the data in the publication
repository until the retention period of the topic expires.
1. When the publication is ready for subscribers, the Cloud Integration Hub server triggers the Data
Integration task that is defined for the subscription through an Informatica Intelligent Cloud Services
REST API.
2. The subscription process uses the Cloud Integration Hub cloud connector to read data from Cloud
Integration Hub.
3. The Data Integration task reads the data from Cloud Integration Hub and then writes the data to the
cloud application.
4. The Cloud Integration Hub server changes the status of the subscription event to complete.
Note: For performance tuning purposes, Cloud Integration Hub writes the data to a folder on the local server
for intermediate staging, and then writes the data to the target location. Cloud Integration Hub deletes the
data from the local server at the end of the subscription process.
When you create or edit a subscription that consumes data with an API, you can define a notification URL.
Cloud Integration Hub sends notifications to this URL when data is ready to consume. Cloud Integration Hub
must be able to access the notification URL.
You can reconsume data that was previously processed by triggering the subscription Completed event
with the Consume Data REST API.
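Because Cloud Integration Hub posts to the notification URL when data is ready, the subscribing side needs a reachable HTTP receiver. The following is a minimal sketch; the endpoint path and payload handling are assumptions, not the documented callback format.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class NotificationHandler(BaseHTTPRequestHandler):
    """Accepts the data-ready callback on whatever path you registered as
    the notification URL."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        # A real receiver would now call the Consume Data API to fetch the data.
        self.log_message("data ready: %s", body.decode("utf-8", "replace"))
        self.send_response(200)
        self.end_headers()

# HTTPServer(("", 8080), NotificationHandler).serve_forever() would listen for
# notifications; the address must be reachable from Cloud Integration Hub.
```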
Subscription mapping
For subscriptions that trigger Data Integration tasks, mapping is the data mapping between the Cloud
Integration Hub publication repository and the target that consumes the data.
A subscription runs a Data Integration task that includes information about the target data structure and the
database connection. The task reads from the topic tables and consumes the data into the target application.
You can create the task in Data Integration and then select it when you create the subscription in Cloud
Integration Hub, or create the task when you create the subscription. Cloud Integration Hub triggers the task
when the publication is ready for subscribers and uses the Cloud Integration Hub cloud connector to read the
data from Cloud Integration Hub.
You can create a compound subscription, where the subscription consumes data sets from multiple topics.
The subscription process starts after all publications from all topics publish data. You can specify the
maximum time to wait for all publications to finish publishing, from the time the first publication is ready to
consume.
Subscription targets
Subscriptions can consume data into any type of target that Informatica Intelligent Cloud Services supports.
Subscription schedules
For subscriptions that trigger Data Integration tasks, the subscription schedule defines the frequency of the
subscription. You can consume published data when it is published, manually, by an external trigger, or at
defined intervals. If you create a compound subscription, you can only choose to consume data when it is
published, manually, or by an external trigger.
Consumption of data by the subscription starts when one of the following conditions exists:
• The subscription schedule is set to consume data immediately after the publisher publishes the data to
Cloud Integration Hub.
• The scheduled start time arrives.
• You start the subscription from a REST API.
• You manually run a subscription.
• You manually get previous publications.
You can define a policy of up to nine retry attempts with a retry interval that is between five minutes and 23
hours. Cloud Integration Hub attempts to reprocess subscription events in an Error status based on the
policy you define. Cloud Integration Hub doesn't attempt to reprocess Error events in the following scenarios:
When Cloud Integration Hub attempts to run a subscription according to the policy, the details of the
subscription event on the Events page indicate that the attempt was based on a retry policy.
When you define a retry policy for a subscription, make sure that the policy doesn't conflict with the
subscription schedule. If a conflict occurs, one of the Processing events is delayed and the subscription
consumes the data when it next runs according to its schedule.
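The documented retry limits (up to nine attempts, interval between five minutes and 23 hours) can be illustrated with a small helper. This is illustrative arithmetic, not Cloud Integration Hub code:

```python
from datetime import datetime, timedelta

# Limits taken from the documentation above.
MAX_RETRIES = 9
MIN_INTERVAL = timedelta(minutes=5)
MAX_INTERVAL = timedelta(hours=23)

def retry_schedule(first_failure, retries, interval):
    """Times at which a failed subscription event would be reprocessed."""
    if not 1 <= retries <= MAX_RETRIES:
        raise ValueError("retry count must be between 1 and 9")
    if not MIN_INTERVAL <= interval <= MAX_INTERVAL:
        raise ValueError("retry interval must be between 5 minutes and 23 hours")
    return [first_failure + interval * n for n in range(1, retries + 1)]

attempts = retry_schedule(datetime(2021, 7, 1, 8, 0), 3, timedelta(minutes=30))
# attempts -> 08:30, 09:00, and 09:30 on 2021-07-01
```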
Subscription management
Create, disable, and enable subscriptions, get previous publications for a subscription, and run a subscription
manually, including disabled subscriptions.
• An application or applications that consume data must exist. You can either use existing applications, or
create and save new applications.
• A topic from which to consume data must exist. You can either use an existing topic, or create and save a
new topic.
• If the subscription triggers a mapping task, a subscription task must exist in Data Integration. If the
subscription triggers a synchronization task, you can either select a subscription task that exists in Data
Integration or create the task.
Tip: You can also create subscriptions on the topic page. For more information, see “Creating a topic” on
page 52.
8. If you added more than one topic to the subscription, specify the maximum number of hours to wait for
all associated publications to finish publishing the data, after the first publication is ready for
consumption.
• If all the publications finish publishing the data during the time interval, the subscription process
starts after the last publication is ready for consumption.
• If one or more of the publications do not finish publishing the data during the time interval, the
subscription process is cancelled and no data is delivered.
9. If the task that defines the subscription mapping exists in Data Integration, choose the task. If not, click
Create New Task to create a synchronization task.
10. To create a synchronization task, enter the following properties in the Create New Task window and click
Create:
Task Name
Source
Select the topic table to consume data from. The format of the object is TopicName/tableName.
Connection
Select the connection that connects to the target to consume data to.
Target
Select the target table to consume the data to. The Create New Task window shows the first 200
tables in the list.
Cloud Integration Hub creates the task in the default folder and assigns the task to the subscription.
11. If the subscription subscribes to large amounts of data, increase the read batch size to optimize the
performance of the subscription.
Note: Increasing the batch size increases the memory consumption of the Secure Agent and might
impact the performance of the Secure Agent machine.
12. Select the method and the frequency of data consumption.
When published data is ready
No schedule. You can use the following methods to run the subscription:
• Run manually from the Cloud Integration Hub explorer.
• Run by an API. Call a command-line API or a REST API that starts the subscription.
If a file subscription uses this scheduling option and publishes multiple files, all the files must be
present in the source location when the subscription starts.
By schedule
Runs the subscription according to the defined schedule. Select one of the following options:
• Every n minutes. Runs the subscription in intervals of up to 60 minutes. You select the number of
minutes from the list.
80 Chapter 8: Subscriptions
• Hourly. Runs the subscription in intervals of up to 24 hours. You select the number of hours from
the list.
• Daily. Runs the subscription at the same hour every day.
• Weekly. Runs the subscription every week on one or more days at the same hour.
• Monthly. Runs the subscription every month on a specific date or a specific day at the same
hour.
Define the delivery intervals in the Repeat running area.
13. Optionally, in the Retry Policy area, select Reprocess Events in Error Status and then select the number
of times Cloud Integration Hub retries to run the subscription in case of failure and the retry interval. You
can define a policy of up to nine retry attempts with a retry interval that is between five minutes and 23
hours.
14. Click Save.
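The retry policy limits described in step 13 can be sketched as a simple validation. This is an illustrative helper, not part of the product; validate_retry_policy is a hypothetical name:

```python
def validate_retry_policy(attempts: int, interval_minutes: int) -> None:
    """Check a retry policy against the documented limits: up to nine
    retry attempts, with a retry interval between five minutes and 23 hours."""
    if not 1 <= attempts <= 9:
        raise ValueError("retry attempts must be between 1 and 9")
    if not 5 <= interval_minutes <= 23 * 60:
        raise ValueError("retry interval must be between 5 minutes and 23 hours")

validate_retry_policy(3, 30)  # a valid policy: 3 retries, 30 minutes apart
```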
• An application or applications that consume data must exist. You can either use existing applications, or
create and save new applications.
• A topic from which to consume data must exist. You can either use an existing topic, or create and save a
new topic.
Tip: You can also create subscriptions on the topic page. For more information, see “Creating a topic” on
page 52.
Running a subscription manually
Use the Explore page to manually run subscriptions that trigger Data Integration tasks.
Tip: You can also run subscriptions manually on the topic page. For more information, see “Topic
properties” on page 53.
1. In the Navigator, click Explore. Click the All Assets list and then select Hub Management >
Subscriptions.
The Explore page shows all existing subscriptions. You can sort the display by name, description, mode,
topic, or last modified.
2. Rest on the subscription and click the Actions menu at the right end of the line. From the menu select
Run.
Tip: You can also get previous publications for a subscription on the topic page. For more information, see
“Subscriptions properties” on page 59.
1. In the Navigator, click Explore. Click the All Assets list and then select Hub Management >
Subscriptions.
The Explore page shows all existing subscriptions. You can sort the display by name, description, mode,
topic, or last modified.
2. Rest on the subscription for which to get previous publications and click the Action menu at the right end
of the line. From the menu select Get Previous Publications, define the date range for which to get the
publications, and then click Run.
Tip: You can also disable and enable subscriptions on the topic page. For more information, see
“Subscriptions properties” on page 59.
1. In the Navigator, click Explore. Click the All Assets list and then select Hub Management >
Subscriptions.
The Explore page shows all existing subscriptions. You can sort the display by name, description, mode,
topic, or last modified.
2. Rest on the subscription to disable or to enable and click the Action menu at the right end of the line.
From the menu select Disable or Enable, as required.
Subscription properties
Subscription properties include general information about the subscription, the applications, topic, and task
to use for the subscription, and subscription scheduling.
Subscription Name
Name of the subscription. The name can contain up to 60 characters, including special characters.
Description
Description of the subscription.
Mode
Subscription mode, enabled or disabled. A disabled subscription does not run according to schedule or
by an external API. You can only run a disabled subscription from the Explore page or from the topic
page of the topic that the subscription subscribes to.
Consumption Method
• Consume data with a Data Integration task. The subscription process triggers a Data Integration task
to retrieve the data from the topic or topics in the Cloud Integration Hub publication repository and
write the data to the subscribing application. Select this method to consume batch data into files,
applications, and repositories.
• Consume data with an API. Use the Consume Data REST API to consume the data from a specific
topic in the Cloud Integration Hub publication repository. Select this method for high frequency,
event-driven subscriptions.
After you configure the subscription properties, you can copy the following URLs from the
subscription page:
• URL of the REST API. Use this URL to consume the data.
• URL of the Swagger structure for the topic from which the subscription consumes data. Use the
structure in the subscription request.
You use the URLs when you create the request that runs the subscription.
Unbound Subscription
A subscription that is not restricted to specific publication instances. It consumes all the publication
events data in the publication repository for the topics that the subscription subscribes to.
Application
Notification URL
URL to where Cloud Integration Hub sends notifications when data is ready to consume. Applies to
subscriptions that consume data with an API.
The notification URL cannot be authenticated and the HTTP request method must be POST. The payload
of the POST request must include the following parameters:
Parameter Description
publicationEventId ID of the event of the publication that published the data to consume.
subscriptionEventId ID of the subscription event that Cloud Integration Hub generates.
subscriptionName Name of the subscription.
For example:
{"publicationEventId":123, "subscriptionEventId" : 234, "subscriptionName" :
"payrollSubscription"}
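For illustration, a subscribing application could parse the notification payload as follows. The field names come from the example above; parse_notification is a hypothetical helper name:

```python
import json

def parse_notification(body: str) -> dict:
    """Extract the identifiers from the JSON payload that Cloud Integration Hub
    POSTs to the notification URL when data is ready to consume."""
    payload = json.loads(body)
    return {
        "publication_event_id": payload["publicationEventId"],
        "subscription_event_id": payload["subscriptionEventId"],
        "subscription_name": payload["subscriptionName"],
    }

# The example payload from the documentation:
notification = ('{"publicationEventId":123, "subscriptionEventId" : 234, '
                '"subscriptionName" : "payrollSubscription"}')
```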
Wait for all topics to be available for consumption for ... hours
Maximum time to wait until all published data is available from the time that the first topic is ready to
consume. Applies to compound subscriptions that consume data from multiple topics.
If all of the publications in all topics finish publishing the data before the maximum time, the
subscription process runs immediately after the last publication is ready to consume. If some
publications are not ready to consume within the maximum time, the subscription process does not run.
An error event is created, and no data is delivered.
Task
Task that defines the subscription mapping. Applies to subscriptions that trigger a Data Integration task.
Create a synchronization task that defines the subscription mapping. Applies to subscriptions that
trigger a Data Integration task.
Task Name
The name of the task must be unique within the organization. The task name is not case sensitive.
The task name can contain alphanumeric characters, spaces, and the following special characters:
_.+-
Source
Select the topic table to consume data from. The format of the object is TopicName/tableName.
Connection
Select the connection that connects to the target to consume data to.
Target
Select the target table to consume the data to. The Create New Task window shows the first 200
tables in the list.
Read Batch Size
Number of records that the Cloud Integration Hub connector reads from the publication repository in a single batch. Applies to subscriptions that trigger a Data Integration task.
Scheduling
Method and frequency of data consumption. Applies to subscriptions that trigger a Data Integration
task.
When published data is ready
No schedule. You can use the following methods to run the subscription:
If a file subscription uses this scheduling option and publishes multiple files, all the files must be
present in the source location when the subscription starts.
By schedule
Runs the subscription according to the defined schedule. Select one of the following options:
• Every n minutes. Runs the subscription in intervals of up to 60 minutes. You select the number of
minutes from the list.
• Hourly. Runs the subscription in intervals of up to 24 hours. You select the number of hours from
the list.
• Daily. Runs the subscription at the same hour every day.
• Weekly. Runs the subscription every week on one or more days at the same hour.
• Monthly. Runs the subscription every month on a specific date or a specific day at the same
hour.
Retry Policy
Defines the number of times Cloud Integration Hub retries to run the subscription in case of failure and the retry interval. Applies to subscriptions that trigger a Data Integration task.
Chapter 9
Events
Cloud Integration Hub generates file events for files that it receives and sends.
Cloud Integration Hub generates events as it processes publications and subscriptions, and it changes the
status of the events as they go through the process. You can view all events on the Events page. From the
Events page you can access the event history, session log, and processing information, and reprocess events
or change the event status. You can use filters to search for specific events.
If your organization uses both Data Integration Hub and Cloud Integration Hub, you can view Data Integration
Hub publication and subscription events on the Events page in Cloud Integration Hub. To set up Cloud
Integration Hub to show Data Integration Hub events, see “Setting up Cloud Integration Hub to show Data
Integration Hub events” on page 29.
You can create rules that monitor publication and subscription events and perform actions on events that are in a defined status. For example, a rule can send an email notification or disable a publication or a subscription when an event reaches a defined status.
The Publication event is the root event and the parent event for all of the subscription events that Cloud
Integration Hub generates during processing. After the published data is ready for subscribers, Cloud
Integration Hub generates a Subscription child event for each subscriber that needs to consume the
published data. The Publication event contains aggregated status information for all Subscription child
events.
By default, the Events page displays root events: Publication, File, Aggregated Subscription, and Compound
Subscription. After a publication is ready for subscribers, you can drill down to the associated Subscription
child events of the publication.
Event Types
Cloud Integration Hub assigns the following event types to publication and subscription events:
• Publication. Assigned to a publication process. Acts as the parent event for all Subscription events and
for File events of publications that publish multiple files.
• Subscription. Assigned to a subscription process. Acts as a child event for a publication event.
• Compound Subscription. Assigned to a subscription process that consumes data sets from multiple topics with a single subscription mapping. The event contains references to all Subscription events that Cloud Integration Hub creates when each topic publication finishes publishing the data set.
• Unbound Subscription. Assigned to a subscription process that is not restricted to specific publication
instances but subscribes to all the data that a publication publishes regardless of when or in what batch
the data was published.
• Aggregated Subscription. Assigned to a subscription process that consumes multiple data sets from the
same topic with a single subscription mapping. The event contains references to all Subscription events
that were created when the associated topic finished publishing each data set. The Subscription events
inherit their status from the Aggregated Subscription event.
• System. Event generated for system notifications. For example, Cloud Integration Hub generates a system
event when a compound subscription cannot consume published data from all required publications.
Event Statuses
For publications, Cloud Integration Hub assigns the following event statuses:
• Processing. Indicates that the publication instance is running.
• Completed. Indicates that the publication instance finished running and that the published data is ready for subscribers.
• Error. Indicates that the publication instance encountered errors and did not finish running.
For subscriptions, Cloud Integration Hub assigns the following event statuses:
• Delayed. Indicates that the published data is ready but that the subscribing application did not start
consuming the data.
• Processing. Indicates that the subscription instance is running.
• Completed. Indicates that the subscription instance finished running and that the subscribing application
consumed all published data.
• Error. Indicates that the subscription instance encountered errors and did not finish running.
When you hover over the Event Status icon on the Events page, event details appear. For example, the time
when the event processing completed, the time when the event changed status, or the cause of the error in
Error events.
Event History
You can view the event status history for each publication or subscription that the Cloud Integration Hub
processes.
The event history shows the processing stages that the publication or subscription passed through, when
each stage started, and the cumulative processing status.
The following table describes the processing stages that can show in the Event History for publications:
Stage Description
Complete The publication instance finished running and data is ready for subscribers.
Error The publication instance encountered errors and did not finish running.
The following table describes the processing stages that can show in the Event History for subscriptions:
Stage Description
Delayed The subscription instance is delayed. Published data is ready but the subscribing application did not
start consuming the data.
Complete The subscription instance finished running and the subscribing application consumed all published
data.
Error The subscription instance encountered errors and did not finish running.
You can access the task session log from the specific event.
If an error occurs during file processing, you can use the related session log to view further information about
the error.
You can access the task processing information from the specific event.
You can access the report from the Actions menu of the event.
Event Filters
You can use filters to narrow the view of the Events page: show events by event ID, type, or status, show events for a selected application, topic, publication, or subscription, or show events for a selected time frame.
You can click the Filter icon to expand the filter pane and define the filtering criteria. After you click Apply
Filter, the event list updates to show the relevant events.
By default, the event list shows all events from the last 24 hours. After you filter the view of the list, click Restore Defaults to restore the default view.
Managing Events
Reprocess an event and change the status of an event.
Note: You can only perform these operations on Cloud Integration Hub events.
Reprocessing an Event
Use the Events page to reprocess events. You can reprocess only subscription events, to re-consume data
that was already consumed.
Event Properties
Event properties include general information about the event, the applications, topic, and task to use for the
event, and event scheduling.
Event ID
ID of the event.
By default, the Events page shows only parent events. To show the list of subscription events for a
publication event, expand the publication event.
Asset Source
This filter appears when there are Data Integration Hub events on the Events page.
Application
For publication events, the application that publishes the data. For subscription events, the application
that consumes the data.
Name of the publication or subscription for which Cloud Integration Hub generates the event.
Topic
For publication events, the topic that the application publishes the data to. For subscription events, the
topic or topics from which the application consumes the data.
Start Time
Time when the event started.
Event Status
Status of the event.
Consumption Status
Applicable for publication events. Data consumption status for the event.
Click the asset in a Data Integration Hub event on the Cloud Integration Hub Events page to open Data
Integration Hub in a new tab and view the Data Integration Hub asset that generated the event.
Event Monitors
You can create event monitors that track publications and subscriptions based on their event status, and initiate actions when an event is in a defined status.
You create monitoring rules that define which entities to monitor, what are the event statuses for which to
take action, and what actions Cloud Integration Hub takes when an event reaches the defined status.
You can create rules that monitor publication and subscription events and perform actions on events that are in a defined status. For example, a rule can send an email notification or disable a publication or a subscription when an event reaches a defined status.
Monitoring Rules
A monitoring rule defines which assets to monitor, the event statuses that trigger actions, and the actions to
take when an event is in a defined status.
When you create a monitoring rule, you define the following elements:
• Asset or assets to which the rule applies. A rule can apply to a single publication, to multiple publications,
or to all current and future publications, or to a single subscription, to multiple subscriptions, or to all
current and future subscriptions
• Event status or statuses to which the rule applies. Cloud Integration Hub applies the rule only to events
that are in a final state.
• Rule action or actions. You can select one or more of the following actions:
- Send email notification. You define the user or users to which Cloud Integration Hub sends an email
notification when the rule conditions are true.
- Pause subscriptions or disable publications and subscriptions that are in the status or statuses to which
the rule applies.
1. In the Navigator, click New > Monitoring Rule. Then click Create.
The New Monitoring Rule page appears.
2. Enter the rule name. Optionally, enter a description for the rule.
3. Select the location to save the rule.
4. Choose the rule mode, enabled or disabled. A disabled rule does not perform the defined actions.
5. Select the type of asset that the rule affects, publication or subscription, and then select the asset or assets to which to apply the rule. You must apply the rule to at least one publication or one subscription.
• To apply the rule to all publications or to all subscriptions, including current publications or
subscriptions and publications or subscriptions that are added to Cloud Integration Hub after you
create the rule, select Apply to all.
• To select a single publication or a single subscription to which to apply the rule, select the check box
to the left of the publication name or the subscription name.
• To select multiple publications or multiple subscriptions to which to apply the rule, select multiple
check boxes to the left of the publication names or the subscription names.
6. Select the event statuses to monitor. You must select at least one status.
7. Select one or both of the following rule actions:
Send email notification
Send email notifications when a publication or a subscription is in one of the affected statuses. You
can send notifications to existing Cloud Integration Hub users or to email addresses that you
specify. You can define up to 30 email recipients.
Perform the following steps for each user:
1. Click Add to the right of Send email notification.
2. Select the name of an existing user or select a non-existing user from the User Name list and
then enter the email address in the Email field.
Cloud Integration Hub sends email notifications to the recipients that you define here when events
of any of the affected publications or subscriptions are in any of the affected statuses.
1. In the Navigator, click Explore. Click the All Assets list and then select Monitors > Monitoring Rules.
The Explore page shows all existing monitoring rules.
2. Click the name of the monitoring rule to edit.
The monitoring rule page shows.
3. Edit the monitoring rule and then click Save.
1. In the Navigator, click Explore. Click the All Assets list and then select Monitors > Monitoring Rules.
The Explore page shows all existing monitoring rules.
2. In the row that contains the rule, click Actions and select one of the following actions:
• To disable a rule select Disable. A disabled rule does not perform the defined actions.
• To enable a disabled rule select Enable.
Monitoring Rule Properties
Monitoring rule properties include general information about the monitoring rule, the asset or assets to which
the rule applies, the event statuses that the rule monitors, and the rule action or actions.
Rule Name
Name of the monitoring rule. The name can contain up to 60 characters and can contain special
characters.
Description
Description of the monitoring rule. The description can contain up to 255 characters.
Mode
Monitoring rule mode, enabled or disabled. A disabled rule does not perform the defined actions.
Content
Affected Assets
The asset or assets that the rule applies to.
Affected Statuses
The statuses of the affected assets that the rule applies to.
Actions
The actions that the rule performs when any of the affected assets are in any of the affected statuses.
Run Publication Subscription
Starts a publication or a subscription, including disabled publications and subscriptions, and returns the event ID of the publication or the subscription event that Cloud Integration Hub generates.
You can use the Run Publication Subscription REST API to publish data and subscribe to data with
publications and subscriptions that trigger a Data Integration task. You cannot use the API to publish
data with publications that publish data directly to a topic or to consume data with subscriptions that
consume data directly from a topic.
Publish Data
Publishes data directly to a topic on the Cloud Integration Hub publication repository. Returns the status
of a publication process.
You can use the Publish Data API to publish data with publications that publish data with an API. You
cannot use the API with publications that trigger a Data Integration task.
Consume Data
Consumes data directly from a topic on the Cloud Integration Hub publication repository. You can use
the Consume Data API to consume data with subscriptions that consume data with an API. You cannot
use the API with subscriptions that trigger a Data Integration task.
If the subscription process fails, you can attempt to consume the published data by reprocessing the
subscription Error event with the API.
You can reconsume data that had previously been processed by triggering the subscription Complete
event with the API.
Change Mode
Changes the mode of a publication or a subscription. That is, it enables a disabled publication or subscription and disables an enabled publication or subscription.
Reprocess Event
You can use the Reprocess Event REST API to reprocess events of subscriptions that trigger a Data
Integration task. You cannot use the API to reprocess events of subscriptions that consume data with an
API.
Event Status
Queries the status of a publication or subscription event according to the event ID.
Catalog
Extracts data from the Cloud Integration Hub catalog, including topic, publication, and subscription
metadata.
Authorization Header
Each Cloud Integration Hub REST API call must contain an authorization header.
The type of the authorization header must be Basic, and the header must include an Informatica Intelligent
Cloud Services user and an Informatica Intelligent Cloud Services password.
For example:
{
Username: [email protected]
Password: MyPassword
}
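Assuming standard HTTP Basic authentication, the header value is the user name and password joined by a colon and Base64-encoded. The following sketch builds it with the Python standard library; basic_auth_header is a hypothetical helper name:

```python
import base64

def basic_auth_header(username: str, password: str) -> dict:
    """Build the Basic authorization header that every Cloud Integration Hub
    REST API call must contain."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return {"Authorization": f"Basic {token}"}

headers = basic_auth_header("user@example.com", "MyPassword")
```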
Note: You can use the Run Publication Subscription REST API to publish data and subscribe to data with
publications and subscriptions that trigger a Data Integration task. You cannot use the API to publish data
and subscribe to data with data-driven publications and subscriptions.
The Run Publication Subscription API returns the response code of the action that you perform. If the
publication or subscription runs successfully, the API returns the event ID of the publication or the
subscription event that Cloud Integration Hub generates. You can run the Cloud Integration Hub Event Status
API to query the status of the publication or subscription event.
• <pod> is the name of the Informatica Intelligent Cloud Services point of delivery (PoD) where you access
Cloud Integration Hub. For example: cih-pod1, or emw1-cih.
• <baseUrl> is the Informatica Intelligent Cloud Services URL. For example: dm-
us.informaticacloud.com/.
For example:
https://cih-pod1.dm-us.informaticacloud.com/dih-console/api/v1/subscription/start
Request syntax for running a publication
For example:
{
"publicationName": "daily_sales",
"runDisabled": "true"
}
Tip: You can copy the values of <pod> and <baseUrl> from the Cloud Integration Hub URL after you
access it from the My Services page of Informatica Intelligent Cloud Services.
Request syntax for running a subscription
For example:
{
"subscriptionName": "daily_report",
"runDisabled": "true"
}
Running a publication or a subscription from the REST API returns one of the following response codes:
• SUCCESS. Cloud Integration Hub triggered the publication or the subscription successfully. The status
message includes the event ID of the publication or the subscription event that Cloud Integration Hub
generates.
• FAILED. Cloud Integration Hub could not trigger the publication or the subscription. The response
provides the reason for the failure. For example, Cloud Integration Hub did not run the subscription
because no publications are ready for consumption by the subscription.
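As an illustration, the documented subscription/start call can be assembled with the Python standard library. build_run_subscription_request is a hypothetical helper; the request is constructed but not sent:

```python
import json
import urllib.request

def build_run_subscription_request(pod: str, base_url: str, subscription_name: str,
                                   auth_header: str, run_disabled: bool = False):
    """Assemble, without sending, the POST request that starts a subscription
    through the Run Publication Subscription REST API."""
    url = f"https://{pod}.{base_url}/dih-console/api/v1/subscription/start"
    body = json.dumps({
        "subscriptionName": subscription_name,
        # The documented request body uses string values for runDisabled.
        "runDisabled": str(run_disabled).lower(),
    }).encode("utf-8")
    return urllib.request.Request(
        url, data=body, method="POST",
        headers={"Content-Type": "application/json", "Authorization": auth_header},
    )

req = build_run_subscription_request("cih-pod1", "dm-us.informaticacloud.com",
                                     "daily_report", "Basic <token>", run_disabled=True)
```

Sending the request, for example with urllib.request.urlopen, would return the SUCCESS or FAILED response described above.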
You can use the Publish Data API to publish data with publications that publish data directly to a topic with
an API. You cannot use the API with publications that trigger a Data Integration task.
To publish data through the API, copy the URL of the API from the Publication page in Cloud Integration Hub.
Note: When you use a private publication repository, if you change the Secure Agent on which the publication
repository service runs or the port number of the publication repository, the URL of the API changes
accordingly. In this case, be sure to notify API users and consumers of the new URL.
Request Headers
For example:
{
"Sales":
If the topic to which you publish includes a DATETIME field, you must use the following format for the
DATETIME value: yyyy-MM-dd HH:mm:ss.SSS.
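For example, the required yyyy-MM-dd HH:mm:ss.SSS form can be produced in Python by trimming strftime's six microsecond digits down to milliseconds. cih_datetime is a hypothetical helper name:

```python
from datetime import datetime

def cih_datetime(value: datetime) -> str:
    """Format a datetime as yyyy-MM-dd HH:mm:ss.SSS for DATETIME topic fields.
    %f yields six microsecond digits, so drop the last three."""
    return value.strftime("%Y-%m-%d %H:%M:%S.%f")[:-3]

stamp = cih_datetime(datetime(2021, 7, 15, 9, 30, 5, 123000))  # "2021-07-15 09:30:05.123"
```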
Publishing data through the REST API returns one of the following response codes:
• SUCCESS. Cloud Integration Hub published the data successfully. The status message includes the event ID of the publication event that Cloud Integration Hub generates, the number of rows accepted, and the number of rows successfully processed.
• FAILED. Cloud Integration Hub could not publish the data.
Note: When you publish data through the Publish Data REST API to a private publication repository and the
publication fails because the publication repository service is not accessible, Cloud Integration Hub
returns an error to the calling application and does not create an error event.
To view the Swagger structure, copy the URL of the structure from the Publication page in Cloud Integration
Hub.
• Consume data from a topic on the Cloud Integration Hub publication repository.
• Reconsume data that had previously been processed by triggering the subscription Complete event.
• Reprocess a subscription Error event to consume published data if a subscription process fails.
You can't use the API with subscriptions that trigger a Data Integration task.
Note: When you use a private publication repository, if you change the Secure Agent on which the publication
repository service runs or the port number of the publication repository, the URL of the API changes
accordingly. In this case, be sure to notify API users and consumers of the new URL.
Request headers
Include the following headers in the Consume Data REST API request:
Accept - application/json
Content-Type - application/json
To support UTF-8 character encoding, for example, to use Japanese characters in table and column names,
include the following headers in the request:
Accept: application/json;charset=utf-8
Accept-Charset: charset=utf-8
Content-Type: application/json;charset=utf-8
Request body
The syntax of the Consume Data REST API request body varies, based on the action you perform with the
API.
Consume data
Set the aggregated parameter in the request body to one of the following values:
• true. The subscription consumes all the available publications in each API call.
• false. The subscription consumes only the oldest publication in each API call.
For example:
{
"aggregated": true
}
When you run multiple publications, you can add the event ID of a specific publication to the request
body to consume only the data of the specific publication event. You can add only one event ID to the
request body.
To add the event ID of a specific publication event to the request, use the following syntax:
{
"publicationEventId" : "<eventId>"
}
For example:
{
"publicationEventId" : "594210"
}
Reconsume data
To reconsume data that had previously been processed, use the following request syntax:
{
"requestType" : "RECONSUME",
"eventId" : "<eventId>"
}
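The request bodies above can be generated programmatically. In this sketch, consume_body and reconsume_body are hypothetical helpers that serialize the documented fields:

```python
import json

def consume_body(aggregated=False, publication_event_id=None):
    """Build the Consume Data request body. Pass publication_event_id to
    consume a single publication event; otherwise the aggregated flag
    controls whether all available publications are consumed in one call."""
    if publication_event_id is not None:
        return json.dumps({"publicationEventId": str(publication_event_id)})
    return json.dumps({"aggregated": aggregated})

def reconsume_body(event_id):
    """Build the request body that reconsumes previously processed data."""
    return json.dumps({"requestType": "RECONSUME", "eventId": str(event_id)})

# UTF-8 aware request headers, as documented above:
utf8_headers = {
    "Accept": "application/json;charset=utf-8",
    "Accept-Charset": "charset=utf-8",
    "Content-Type": "application/json;charset=utf-8",
}
```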
Consuming data through the REST API returns one of the following response codes:
SUCCESS
Cloud Integration Hub consumed the data successfully.
FAILURE
Cloud Integration Hub could not consume the data, for example, if there is no pending data for the subscription to consume. The response includes a description of the error that caused the failure.
To view the Swagger structure, copy the URL of the structure from the Subscription page in Cloud Integration
Hub.
To change the mode of a publication, use the following REST URL:
https://<pod>.<baseUrl>/dih-console/api/v1/publication/changemode
Where:
• <pod> is the name of the Informatica Intelligent Cloud Services point of delivery (PoD) where you access
Cloud Integration Hub. For example: cih-pod1, or emw1-cih.
• <baseUrl> is the Informatica Intelligent Cloud Services URL. For example: dm-
us.informaticacloud.com/.
For example:
https://cih-pod1.dm-us.informaticacloud.com/dih-console/api/v1/publication/changemode
To change the mode of a subscription, use the following REST URL:
https://<pod>.<baseUrl>/dih-console/api/v1/subscription/changemode
Where:
• <pod> is the name of the Informatica Intelligent Cloud Services point of delivery (PoD) where you access
Cloud Integration Hub. For example: cih-pod1, or emw1-cih.
• <baseUrl> is the Informatica Intelligent Cloud Services URL. For example: dm-
us.informaticacloud.com/.
For example:
https://cih-pod1.dm-us.informaticacloud.com/dih-console/api/v1/subscription/changemode
Tip: You can copy the values of <pod> and <baseUrl> from the Cloud Integration Hub URL after you access it
from the My Services page of Informatica Intelligent Cloud Services.
Changing the mode of a publication or a subscription from the REST API returns one of the following
response codes:
• When Cloud Integration Hub changes the mode of the publication or the subscription successfully, the API
returns a SUCCESS response.
• When Cloud Integration Hub fails to change the mode of the publication or the subscription, the response
provides the reason for the failure. For example, when you do not have sufficient privileges to perform the
operation.
• <pod> is the name of the Informatica Intelligent Cloud Services point of delivery (PoD) where you access
Cloud Integration Hub. For example: cih-pod1, or emw1-cih.
• <baseUrl> is the Informatica Intelligent Cloud Services URL. For example: dm-
us.informaticacloud.com/.
For example:
https://cih-pod1.dm-us.informaticacloud.com/dih-console/api/v1/event/reprocess
Use the following syntax to reprocess an event:
{
"eventId" : "<eventId>"
}
For example:
{
"eventId" : "40558"
}
Property Description
reprocessEventId New event ID that Cloud Integration Hub generates for the subscription when it reprocesses the
existing event.
message Error message. If the response code is 0 (success), the API returns the message null.
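A caller might interpret the Reprocess Event response as follows. handle_reprocess_response is a hypothetical helper, and the sample response is illustrative, shaped after the properties above:

```python
import json

def handle_reprocess_response(body: str):
    """Return the new event ID from a Reprocess Event response.
    A null message indicates success; otherwise raise the error message."""
    resp = json.loads(body)
    if resp.get("message") is None:
        return resp["reprocessEventId"]
    raise RuntimeError(resp["message"])

# Illustrative success response:
new_event = handle_reprocess_response('{"reprocessEventId": 40981, "message": null}')
```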
The manner in which Cloud Integration Hub returns the event ID depends on the API that you use to run the
publication or the subscription:
• When you run the REST API, Cloud Integration Hub returns the event ID in the REST API response.
• When you run the command line API, Cloud Integration Hub returns the event ID in the command line
notification.
You can use the Cloud Integration Hub Event Status REST API to query the status of the publication or
subscription event according to the event ID. You can see whether the publication or subscription process is
still running, and after the process is complete, you can see whether it completed successfully. If the process
fails, the response to the query includes the cause of the failure.
Note: For a list of event statuses, see “Event Statuses” on page 87.
To query the status of an event, use a GET command with the following REST URL:
https://<pod>.<baseUrl>/dih-console/api/v1/event/<eventId>
Where:
• <pod> is the name of the Informatica Intelligent Cloud Services point of delivery (PoD) where you access Cloud Integration Hub. For example: cih-pod1 or emw1-cih.
• <baseUrl> is the Informatica Intelligent Cloud Services URL. For example: dm-us.informaticacloud.com.
For example:
https://cih-pod1.dm-us.informaticacloud.com/dih-console/api/v1/event/2435
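Because the example responses in this section carry an isFinal flag, a caller can poll the Event Status API until the event reaches a final status. Below is a minimal Python sketch with the HTTP fetch injected as a callable, so the polling logic stands alone; the names and structure are illustrative, not part of the product.

```python
def event_status_url(pod, base_url, event_id):
    # Event Status endpoint shown above, built from <pod> and <baseUrl>.
    return (f"https://{pod}.{base_url.rstrip('/')}"
            f"/dih-console/api/v1/event/{event_id}")

def wait_for_event(fetch, event_id, max_polls=100):
    """Poll the Event Status API until the event reaches a final status.

    fetch is a callable that GETs the status of an event ID and returns
    the decoded JSON response as a dict; a real caller would also sleep
    between polls."""
    for _ in range(max_polls):
        status = fetch(event_id)
        if status.get("isFinal"):
            return status
    raise TimeoutError(f"event {event_id} not final after {max_polls} polls")
```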
The response includes the following properties:
• eventId. ID of the event that Cloud Integration Hub generates for the publication or for the subscription.
• eventType. Type of the event that Cloud Integration Hub generates for the publication or for the subscription.
• topicName. Name of the topic that is associated with the publication or with the subscription.
• eventStatus. Status of the event that Cloud Integration Hub generates for the publication or for the subscription.
• eventStartTimeLong. Time when the publication or the subscription event started. System time in milliseconds, as returned by the Java API java.lang.System.currentTimeMillis.
• eventEndTimeLong. Time when the publication or the subscription event ended. System time in milliseconds, as returned by the Java API java.lang.System.currentTimeMillis.
• referencedEventsList. Applicable for file publication events, aggregated subscription events, and compound subscription events. List of event IDs that are related to the file publication, the aggregated subscription, or the compound subscription event. For example, the referencedEventsList of a file publication event includes the file events of the files that are published as part of the publication event.
• sourceSuccessRows. Number of source rows that Cloud Integration Hub read successfully.
• sourceFailedRows. Number of source rows that Cloud Integration Hub failed to read.
• targetFailedRows. Number of target rows that Cloud Integration Hub failed to write.
• targetSuccessRows. Number of target rows that Cloud Integration Hub wrote successfully.
• detailedMessage. Applicable for events in an Error status. If the error is caused by Cloud Integration Hub, detailedMessage returns the error message from the Cloud Integration Hub event. For any other error, for example an authentication failure or an incorrect REST URL request, detailedMessage includes a message that describes the cause of the error.
Response to a request to query the status of publication event 4003:
{
"responseCode": "SUCCESS",
"eventId": 4003,
"eventType": "Publication",
"topicName": "top_120",
"publicationName": "ng_pub_120_1",
"applicationName": "app1",
"eventStatus": "Complete",
"eventStartTimeLong": 1431078308560,
"eventEndTimeLong": 1431078313780,
"isFinal": true,
"isError": false,
"sourceSuccessRows": 10,
"sourceFailedRows": 0,
"targetFailedRows": 0,
"targetSuccessRows": 10
}
Response to a request to query the status of aggregated subscription event 3009, which includes
subscription events 3008 and 3007:
{
"responseCode": "SUCCESS",
"eventId": 3009,
"eventType": "Aggregated Subscription",
"topicName": "topic1",
"subscriptionName": "sub1",
"applicationName": "app1",
"eventStatus": "Complete",
"eventStartTimeLong": 1431065700088,
"eventEndTimeLong": 1431065704372,
"referencedEventsList": "3008,3007",
"isFinal": true,
"isError": false,
"sourceSuccessRows": 15,
"sourceFailedRows": 0,
"targetFailedRows": 0,
"targetSuccessRows": 15
}
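Note that referencedEventsList is returned as a single comma-separated string rather than a JSON array, as in the response above. A small Python sketch (the helper name is illustrative) that turns it into a list of event IDs:

```python
def referenced_event_ids(event):
    # referencedEventsList is a comma-separated string such as "3008,3007";
    # the key is absent for events that have no related events.
    raw = event.get("referencedEventsList")
    return [int(part) for part in raw.split(",")] if raw else []
```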
Response to a request to query the status of publication event 3016, where the publication process failed:
{
"responseCode": "SUCCESS",
"eventId": 3016,
"eventType": "Publication",
"topicName": "top_120",
"publicationName": "ng_pub_120_1",
"applicationName": "app1",
"eventStatus": "Error",
"eventStartTimeLong": 1431066353202,
You can extract metadata pertaining to topics, publications, and subscriptions for which you have both View
and Read privileges.
To extract data from the catalog, use the following REST URL:
https://<pod>.<baseUrl>/dih-console/api/v1/catalog/topics
Where:
• <pod> is the name of the Informatica Intelligent Cloud Services point of delivery (PoD) where you access Cloud Integration Hub. For example: cih-pod1 or emw1-cih.
• <baseUrl> is the Informatica Intelligent Cloud Services URL. For example: dm-us.informaticacloud.com.
For example:
https://cih-pod1.dm-us.informaticacloud.com/dih-console/api/v1/catalog/topics
Tip: You can copy the values of <pod> and <baseUrl> from the Cloud Integration Hub URL after you access it
from the My Services page of Informatica Intelligent Cloud Services.
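A minimal Python sketch of reading the catalog, assuming the URL above and the response layout shown in the example later in this section; the function names are illustrative, not part of the product:

```python
def catalog_url(pod, base_url):
    # Catalog endpoint shown above, built from <pod> and <baseUrl>.
    return (f"https://{pod}.{base_url.rstrip('/')}"
            "/dih-console/api/v1/catalog/topics")

def topic_names(catalog_response):
    # Pull the topic names out of a decoded catalog response;
    # catalogTopics holds one entry per topic.
    if catalog_response.get("responseCode") != "SUCCESS":
        return []
    return [t["topicName"] for t in catalog_response.get("catalogTopics", [])]
```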
The response includes the following data for each topic:
• topicName
• topicDesc
• topicType
• For each table in the topic, an entry with the table name and detailed information about each of the table fields.
• publications. For each publication that is associated with the topic, the following data is provided:
- publicationName
- publicationDesc
- applicationName
- publicationSourceType
- publicationConnectionName. For relational database publications and for HDFS publications: the name of the connection from which the publication workflow reads the data or the files to be published.
- publicationDBType
• subscriptions. For each subscription that is associated with the topic, the following data is provided:
- subscriptionName
- subscriptionDesc
- applicationName
- subscriptionTargetType
- subscriptionConnectionName. For relational database subscriptions and for HDFS subscriptions: the name of the connection to which the subscription workflow writes the data or the files that the application consumes.
- subscriptionDBType
{
"responseCode": "SUCCESS",
"catalogTopics": [
{
"topicName": "FileTopic",
"topicDesc": null
}
],
"publications": [
{
"publicationName": "OrdersPublication",
"publicationDesc": null,
"applicationName": "OrderPublications",
"publicationSourceType": "CUSTOM",
"publicationConnectionName": null,
"publicationDBType": null
}
],
"subscriptions": [
{
"subscriptionName": "OrdersSubscription",
"subscriptionDesc": null,
"applicationName": "OrderSubscriptions",
"subscriptionTargetType": "CUSTOM",
"subscriptionConnectionName": null,
"subscriptionDBType": null
},
{
"subscriptionName": "OrderSubs",
"subscriptionDesc": null,
"applicationName": "OrderSubscriptions",
"subscriptionTargetType": "CUSTOM",
"subscriptionConnectionName": null,
Glossary
aggregated subscription
A subscription that consumes multiple data sets from the same topic with a single batch workflow. An
aggregated subscription can use an automatic mapping or a custom mapping to process data. When you use
an automatic mapping, the subscription sorts the data according to the publication date and time of the
publication instances.
application
An entity that represents a system in your organization that needs to share data with other systems. An
application can be a publisher and a subscriber. An application can publish multiple data sets.
child event
An event within the hierarchy of another event that acts as a parent event. The child event is a subsidiary of
the parent event.
compound subscription
A subscription that consumes data sets from multiple topics with a single synchronization task.
When you use a Data Integration task to process publications, you use the Cloud Integration Hub cloud
connector as the publication target. When you use a Data Integration task to process subscriptions, you use
the Cloud Integration Hub cloud connector as the subscription source.
event
An occurrence of a publication or subscription at each stage of processing. The Cloud Integration Hub server
generates the event and updates the event status while it processes the publication or subscription.
parent event
An event at the top level of a hierarchy of events.
publication
An entity that defines data flow from a data source to the Cloud Integration Hub publication repository and
the data publishing schedule. The publication publishes the data to a topic that defines the structure of the
data in the publication repository. When a publication runs, Cloud Integration Hub extracts the data set from
the application, processes the data, and writes the data to the publication repository. You can then create one
or more subscriptions to process and write the published data set to target applications.
publication repository
A relational database table set that stores published data sets that subscribers can consume. Cloud
Integration Hub stores the data in the publication repository in the following ways:
subscription
An entity that defines the type, format, and schedule of data flow from the Cloud Integration Hub publication
repository to a data target. When a subscription runs, Cloud Integration Hub extracts the data set from the
publication repository, processes the data, and writes the data to the target application. You can subscribe to
one or more topics. Each topic to which you subscribe can contain data from multiple publishers.
topic
An entity that represents a data domain that applications publish and consume through Cloud Integration
Hub. A topic defines the data structure and additional data definitions such as the data retention period.
Multiple applications can publish to the same topic. Multiple applications can consume data from the same
topic.
unbound subscription
A subscription that is not restricted to specific publication instances. It subscribes to all the data that a
publication publishes and consumes the data based on the subscription filter, regardless of when or in what
batch the data was published.