Bridge Specifications
Vendor | SAP |
Tool Name | Business Warehouse 4 HANA (BW/4HANA) |
Tool Version | 7.3 to 7.5 |
Tool Web Site | https://www.sap.com/products/business-warehouse.html |
Supported Methodology | [Business Application] Multi-Model, BI Design (OLAP Source) via JCO API |
SPECIFICATIONS
Tool: SAP / Business Warehouse 4 HANA (BW/4HANA) version 7.3 to 7.5 via JCO API
See https://www.sap.com/products/business-warehouse.html
Metadata: [Business Application] Multi-Model, BI Design (OLAP Source)
Component: SapBw version 11.0.0
OVERVIEW
This bridge imports SAP NetWeaver Business Warehouse (BW) metadata.
REQUIREMENTS
The bridge relies on the SAP Java Connector (JCo) API libraries to connect and retrieve metadata. Therefore, the JCo libraries must be available on the machine executing this bridge.
The API communicates with the SAP server over the local network, and the following server ports may be used:
- Dispatcher port: 32NN used by SAP GUI for Windows and Java
- Gateway port: 33NN used for CPIC and RFC communications
- SNC secured Gateway port: 48NN used for CPIC and RFC encrypted communications
(where NN is your SAP Instance number from 00 to 99).
Make sure that your firewall settings allow communications on these ports.
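As a quick pre-flight test before running the bridge, the following minimal Java sketch builds a JCo destination and pings the SAP server to confirm that the JCo libraries are installed and the ports above are reachable. This is only an illustrative sketch: it assumes the JCo 3.x libraries (sapjco3.jar and its native library) are on the classpath, and the destination name BW_TEST, host name, and credentials are placeholders to replace with your own 'Application server', 'System number', 'Client', 'User name' and 'Password' bridge parameters.

import java.io.FileOutputStream;
import java.util.Properties;

import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.ext.DestinationDataProvider;

public class SapBwConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder values: use your own bridge connection parameters instead.
        Properties props = new Properties();
        props.setProperty(DestinationDataProvider.JCO_ASHOST, "bwhost.example.com"); // Application server
        props.setProperty(DestinationDataProvider.JCO_SYSNR,  "00");                 // System number (NN)
        props.setProperty(DestinationDataProvider.JCO_CLIENT, "100");                // Client (000-999)
        props.setProperty(DestinationDataProvider.JCO_USER,   "BW_USER");
        props.setProperty(DestinationDataProvider.JCO_PASSWD, "secret");
        props.setProperty(DestinationDataProvider.JCO_LANG,   "EN");

        // Standalone JCo resolves a destination from a <name>.jcoDestination
        // properties file in the working directory.
        try (FileOutputStream out = new FileOutputStream("BW_TEST.jcoDestination")) {
            props.store(out, "temporary destination for a connectivity test");
        }

        JCoDestination destination = JCoDestinationManager.getDestination("BW_TEST");
        destination.ping(); // fails with a JCoException if the host or ports above are unreachable
        System.out.println("Connected to " + destination.getApplicationServerHost());
    }
}

Storing credentials in a plain properties file is only acceptable for such a one-off test; the bridge itself takes these values from its connection parameters.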
Before using the bridge, you must configure the SAP BW server by deploying an ABAP RFC function module.
You need an ABAP developer account to create the RFC function module on the SAP server.
This module responds to queries from the bridge to retrieve the necessary metadata.
Supplemental documentation explaining how to deploy the RFC function module on the server is available at:
<InstallDir>/conf/MIRModelBridgeTemplate/SapBw/
The user account requires sufficient permissions to connect to the SAP server and execute the following RFC function modules:
- STFC_CONNECTION (check connectivity)
- RFC_SYSTEM_INFO (check system information)
- Z_MITI_BW_DOWNLOAD (main metadata import)
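To check that the account can actually execute these modules, you can call the two standard ones through JCo before scheduling a full import. The sketch below is illustrative only and reuses the hypothetical BW_TEST destination from the previous example; Z_MITI_BW_DOWNLOAD is not called here since it only exists after the RFC function module has been deployed on the server.

import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoFunction;

public class SapBwRfcPermissionCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical destination name; see the connection sketch above.
        JCoDestination destination = JCoDestinationManager.getDestination("BW_TEST");

        // STFC_CONNECTION echoes a text back, proving the user may execute RFC function modules.
        JCoFunction stfc = destination.getRepository().getFunction("STFC_CONNECTION");
        stfc.getImportParameterList().setValue("REQUTEXT", "bridge connectivity check");
        stfc.execute(destination);
        System.out.println("Echo: " + stfc.getExportParameterList().getString("ECHOTEXT"));

        // RFC_SYSTEM_INFO returns basic system information, e.g. the system ID.
        JCoFunction info = destination.getRepository().getFunction("RFC_SYSTEM_INFO");
        info.execute(destination);
        System.out.println("System ID: "
                + info.getExportParameterList().getStructure("RFCSI_EXPORT").getString("RFCSYSID"));
    }
}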
Retrieving metadata from the SAP server may take a few hours, depending on the volume of metadata, the workload of the SAP server, and the speed of the network between the SAP server and the local machine.
The bridge 'Miscellaneous' parameter can be used to store the downloaded metadata as text files in a local directory.
The bridge 'Offline metadata directory' parameter can then be used to read the metadata from previously downloaded text files, which speeds up bridge execution by avoiding the delay of downloading from the server again.
FREQUENTLY ASKED QUESTIONS
n/a
LIMITATIONS
Refer to the current general known limitations at http://metaintegration.com/Products/MIMB/MIMBKnownLimitations.html or bundled in Documentation/ReadMe/MIMBKnownLimitations.html
SUPPORT
Provide a troubleshooting package with:
- the debug log (can be set in the UI or in conf/conf.properties with MIR_LOG_LEVEL=6)
- the metadata backup if available (can be set in the Miscellaneous parameter with option -backup)
Q: How do I provide metadata to the support team to reproduce an issue?
A: Set the bridge parameter 'Incremental import' to 'False' and configure the bridge 'Miscellaneous' parameter to save the SAP metadata as text files into a local directory.
Compress the resulting files into a ZIP archive and send it to the support team.
Bridge Parameters
Parameter Name | Description | Type | Values | Default | Scope
Application server | Enter here the name or IP address of the SAP Application Server Host to connect to. | STRING | | | Mandatory
Router string | Enter here the SAP router string to use for a system protected by a firewall. | STRING
System number | Enter here the SAP system number (the instance ID of the ABAP instance), a two-digit integer between 00 and 99. | NUMERIC | | | Mandatory
Client | Enter here the SAP system client ID, a three-digit number from 000 to 999. | NUMERIC | | | Mandatory
User name | Enter here your logon user name; it must be a valid user name on the SAP system. | STRING | | | Mandatory
Password | Enter here your logon user password. | PASSWORD | | | Mandatory
Default Language | Specify the language you would like to use. | ENUMERATED | | DE
JCo library path | The bridge reads metadata from SAP using the Java Connector (JCo) 3.0 API. Specify in this parameter the directory path where the JCo libraries are located. On Microsoft Windows, for example, this directory should contain:
- sapjco3.jar
- sapjco3.dll
Different versions of the JCo libraries are available from SAP for various operating systems and processor architectures. Make sure that you are using the correct JCo distribution for your environment: for example, a 32-bit Java JVM on a 32-bit Windows platform requires the 32-bit JCo libraries for Intel x86 processors.
For download, licensing and other information, please refer to https://support.sap.com/en/product/connectors/jco.html
DIRECTORY | | | Mandatory
Mapping configuration file | Some transformations in BW are represented with ABAP code. It is necessary to document these transformations in a mapping configuration file. Specify here the path to the configuration file, which uses the following format:
<Functions>
  <Function id="[TransformationId]" objType="TRFN">
    <Target id="RESULT_FIELDS-[dstFieldName]" description="RESULT_FIELDS-[dstFieldName] = SOURCE_FIELDS-[srcFieldName] * [externalObjectId]-[externalField].">
      <Source id="SOURCE_FIELDS-[srcFieldName]"/>
      <Source id="[externalObjectId]-[externalField]"/>
    </Target>
  </Function>
</Functions>
FILE | *.xml
Naming convention | Specify the naming rule you want to use. SAP BW metadata objects have a technical name and optionally a description, and the description may vary depending on the user language. SAP BW tools offer various naming conventions to customize how names are displayed. This parameter allows one to reproduce such conventions; the possible choices are:
- [TechnicalName] Description
- Description [TechnicalName]
- Description_TechnicalName
- [TechnicalName]
ENUMERATED | | KTX
Incremental import | Incremental import only extracts what has changed since the last import. The initial full metadata harvesting (model import) of a very large source system can take a long time. However, the extracted metadata are organized as a multi-model, where each model is a unit of change (e.g. a schema of an RDBMS server, or a report of a BI server). Subsequent model imports are dramatically faster than the initial import, as this bridge automatically tries to detect changes in the source system in order to only process the modified, added or deleted models, and reuses all unchanged metadata from the model cache. Note however that change detection is more or less efficient depending on the source system: e.g. BI servers can quickly provide the list of new, modified or deleted reports, but not all data stores offer schema-level change detection.
'True': import only the changes made since the last import.
'False': import all metadata. This option is required after upgrading the bridge, in particular to take full advantage of any additional metadata coverage.
For debugging purposes, the option -cache.clear of the Miscellaneous parameter can be used to clear one model from the cache, which is located (by default) in: $HOME/data/MIMB/cache/<BridgeId>/<ModelId>
BOOLEAN | | True
Offline metadata directory | In order to facilitate testing, and to reproduce an SAP metadata environment when that environment is not installed locally, this parameter allows importing metadata from files previously downloaded from the SAP server. Specify in this parameter the directory path where the downloaded files are located. No connection to the SAP server is needed in this case; the usual connection parameters are ignored. | DIRECTORY
Miscellaneous | INTRODUCTION
Specify miscellaneous options starting with a dash and optionally followed by parameters, e.g.
-connection.cast MyDatabase1="SQL Server"
Some options can be used multiple times if applicable, e.g.
-connection.rename NewConnection1=OldConnection1 -connection.rename NewConnection2=OldConnection2;
As the list of options can become a long string, it is possible to load it from a file, which must be located in ${MODEL_BRIDGE_HOME}\data\MIMB\parameters and have the extension .txt. In that case, all options must be defined within that file as the only value of this parameter, e.g.
ETL/Miscellaneous.txt

JAVA ENVIRONMENT OPTIONS
-java.memory <Java Memory's maximum size> (previously -m)
1G by default on a 64-bit JRE, or as set in conf/conf.properties, e.g.
-java.memory 8G
-java.memory 8000M

-java.parameters <Java Runtime Environment command line options> (previously -j)
This option must be the last one in the Miscellaneous parameter, as all the text after -java.parameters is passed "as is" to the JRE, e.g.
-java.parameters -Dname=value -Xms1G
The following option must be set when a proxy is used to access the internet (this is critical to access https://repo.maven.apache.org/maven2/ and exceptionally a few other tool sites) in order to download the necessary third-party software libraries.
-java.parameters -Dhttp.proxyHost=127.0.0.1 -Dhttp.proxyPort=3128 -Dhttps.proxyHost=127.0.0.1 -Dhttps.proxyPort=3128 -Dhttp.proxyUser=user -Dhttp.proxyPassword=pass -Dhttps.proxyUser=user -Dhttps.proxyPassword=pass

-java.executable <Java Runtime Environment full path name> (previously -jre)
It can be an absolute path to javaw.exe on Windows or a link/script path on Linux, e.g.
-java.executable "c:\Program Files\Java\jre1.8.0_211\bin\javaw.exe"

-environment.variable <name>=<value> (previously -v)
None by default, e.g.
-environment.variable var2="value2 with spaces"

MODEL IMPORT OPTIONS
-model.name <model name>
Override the model name, e.g.
-model.name "My Model Name"

-prescript <script name>
The script must be located in the bin directory and have a .bat or .sh extension. The script path must not include any parent directory symbol (..). The script should return exit code 0 to indicate success, or another value to indicate failure. For example:
-prescript "script.bat arg1 arg2"

-cache.clear
Clears the cache before the import, and therefore runs a full import without incremental harvesting.
Warning: this is a system option managed by the application calling the bridge and should not be set by users.

-backup <directory>
Full path of an empty directory to save the metadata input files for further troubleshooting.

DATA CONNECTION OPTIONS
Data connections are produced by the import bridges, typically from ETL/DI and BI tools, to refer to the source and target data stores they use. These data connections are then used by metadata management tools to connect them (metadata stitching) to their actual data stores (e.g. databases, file systems, etc.) in order to produce the full end-to-end data flow lineage and impact analysis. The name of each data connection is unique per import model. The data connection names used within DI/BI design tools are used when possible, otherwise connection names are generated to be short but meaningful, such as the database / schema name, the file system path, or the Uniform Resource Identifier (URI). The following options allow manipulating connections. These options replace the legacy options -c, -cd, and -cs.

-connection.cast ConnectionName=ConnectionType
Casts a generic database connection (e.g. ODBC/JDBC) to a precise database type (e.g. ORACLE) for SQL parsing, e.g.
-connection.cast "My Database"="SQL SERVER"
The list of supported data store connection types includes:
ACCESS, CASSANDRA, DB2, DENODO, HIVE, MYSQL, NETEZZA, ORACLE, POSTGRESQL, PRESTO, REDSHIFT, SALESFORCE, SAP HANA, SNOWFLAKE, SQL SERVER, SYBASE, TERADATA, VECTORWISE, VERTICA

-connection.rename OldConnection=NewConnection
Renames an existing connection to a new name, e.g.
-connection.rename OldConnectionName=NewConnectionName
Multiple existing database connections can be renamed and merged into one new database connection, e.g.
-connection.rename MySchema1=MyDatabase -connection.rename MySchema2=MyDatabase

-connection.split oldConnection.Schema1=newConnection
Splits a database connection into one or multiple database connections.
A single database connection can be split into one connection per schema, e.g.
-connection.split MyDatabase
All database connections can be split into one connection per schema, e.g.
-connection.split *
A database connection can be explicitly split, creating a new database connection by appending a schema name to a database, e.g.
-connection.split MyDatabase.schema1=MySchema1

-connection.map DestinationPath=SourcePath
Maps a source path to a destination path. This is useful for file system connections when different paths point to the same object (directory or file).
On Hadoop, a process can write into a CSV file specified with the HDFS full path, but another process reads from a HIVE table implemented (external) by the same file specified using a relative path with a default file name and extension, e.g.
-connection.map hdfs://host:8020/users/user1/folder/file.csv=/user1/folder
On Linux, a given directory (or file) like /data can be referred to by multiple symbolic links like /users/john and /users/paul, e.g.
-connection.map /users/John=/data -connection.map /users/paul=/data
On Windows, a given directory like C:\data can be referred to by multiple network drives like M: and N:, e.g.
-connection.map M:\=C:\data -connection.map N:\=C:\data

-connection.casesensitive ConnectionName
Overrides the default case-insensitive matching rules for the object identifiers inside the specified connection, provided the detected type of the data store by itself supports this configuration (e.g. Microsoft SQL Server, MySQL, etc.), e.g.
-connection.casesensitive "My Database"

SAP OPTIONS
-jco.file <file path>
Specifies the path to a file with additional connection details, such as SNC connection details. The bridge appends the parameters specified in the bridge configuration to this file before passing it to the BW Java Connector (JCo).
STRING
Bridge Mapping
Meta Integration Repository (MIR) Metamodel (based on the OMG CWM standard) | "SAP Business Warehouse 4 HANA (BW/4HANA)" Metamodel SapBw | Mapping Comments
Attribute | DataSource Field | |
Name | Name | |
Position | Position | |
BaseType | Datatype | BaseTypes are inferred from datatypes found in BW |
DataType | Datatype | See datatype conversion arrays |
Name | Based on the datatype | |
Scale | Datatype | extracted from the datatype |
UpperBound | Validation_Rule_Max_Value | |
Class | DataSource | DataSources with multiple segments are not well supported yet (only one segment is assumed) |
CppClassType | Set to ENTITY | |
CppPersistent | Set to True | |
Name | Technical Name | |
ClassifierMap | Used to connect objects together, and represent data lineage | |
Condition | Query Filter, Characteristic restriction | represents data filtering in Queries |
Name | Name | computed from Technical Name and Description |
NativeId | UID | |
PhysicalName | Technical Name | |
Connection | TransferRules, UpdateRules, Transformations | BW objects which move or transfer data from one object to another are imported as Data Mappings |
MatchingRule | Long Description | |
Name | Name | computed from Technical Name and Description |
DatabaseSchema | DataSource | A Schema is created for each DataSource |
Name | Name | computed from Technical Name and Description |
DerivedType | Datatype | |
DataType | Datatype | See datatype conversion arrays |
Length | Length | |
Name | Derived from BW datatype name | |
Scale | Decimal Places | |
Dimension | InfoSource, InfoObject, InfoProvider, Query, QueryView, Reports | Represents the internal structure of main objects in BW |
Comment | Short Description | |
Description | Long Description | |
Name | Name | computed from Technical Name and Description |
PhysicalName | Technical Name | |
UserDefined | set to True | |
DimensionAttribute | Characteristic InfoObject | |
Comment | Short Description | |
Description | Long Description | |
Name | Name | computed from Technical Name and Description |
PhysicalName | Technical Name | |
DirectoryStructureModel | BW system | Represents the BW system |
Name | System ID | |
FeatureMap | Used to connect objects together, and represent data lineage | |
Folder | InfoArea, Application | also used to group objects together |
Author | Owner | |
Description | Long Description | |
LastModificationTime | Last changed timestamp | |
Modifier | Last changed by | |
Name | Name | computed from Technical Name and Description |
Measure | KeyFigure InfoObject | |
Comment | Short Description | |
DefaultAggregation | Aggregation | |
Description | Long Description | |
Name | Name | computed from Technical Name and Description |
PhysicalName | Technical Name | |
OlapSchema | InfoSource, InfoObject, InfoProvider, Query, QueryView, Reports | an OlapSchema is created for each main BW object, to contain its structural details |
Name | Name | computed from Technical Name and Description |
PropertyElementTypeScope | BW specific properties | Used to record some BW specific properties, such as SourceSystem connection parameters |
Scope | property scope | |
PropertyType | BW specific properties | Used to record some BW specific properties, such as SourceSystem connection parameters |
DataType | Type | |
Name | Name | |
PropertyValue | BW specific properties | Used to record some BW specific properties, such as SourceSystem connection parameters |
Value | Value | value set on an object |
StoreContent | InfoSource, InfoObject, InfoProvider, Query, QueryView, Reports | Main BW objects are imported as individual models |
Author | Owner | |
Description | Long Description | |
LastModificationTime | Last changed timestamp | |
Modifier | Last changed by | |
Name | Name | computed from Technical Name and Description |
StoreModel | DataSource, InfoSource, InfoProvider, Query, QueryView, Reports | Main BW objects are imported as individual models |
Author | Owner | |
Comment | Short Description | |
Description | Long Description | |
ModificationTime | Last changed timestamp | |
Modifier | Last changed by | |
Name | Name | computed from Technical Name and Description |
PhysicalName | Technical Name | |
Semantics | Code |