SAP ABAP, SAP HANA Cloud, Data Integration

Data Transfer to GBQ From S/4HANA Private Cloud

In an SAP implementation, one of the most critical aspects is reporting. SAP data is often integrated with many non-SAP systems to build reports in tools such as Looker, so this blog explores how SAP can be integrated with Google BigQuery (GBQ) using the BigQuery Connector for SAP.

Specifically, it covers data transfer from SAP to GBQ using the embedded SLT in S/4HANA.

1. System Considerations

  • SAP S/4HANA Cloud, Private Edition (RISE with SAP)
  • Embedded SLT in S/4HANA
  • Google BigQuery (GBQ)

2. Scenario

In this blog, we will set up data replication from an SAP S/4HANA system hosted on GCP to GBQ using the BigQuery Connector for SAP and embedded SLT.

3. Infrastructure Setup between SAP and GBQ

The full set of installation and configuration guides for the BigQuery Connector for SAP is available here:

https://cloud.google.com/solutions/sap/docs/bq-connector/latest/all-guides

The infrastructure setup differs depending on where the SAP system is hosted. This use case considers S/4HANA Private Cloud running on a Compute Engine virtual machine (VM) on Google Cloud.

3.1 SAP Application Server Infrastructure Setup

  • Install the gcloud CLI on the S/4HANA OS: If not already available, install the gcloud CLI on the operating system of each SAP application server and verify the installation with the command gcloud -v, as shown in the sketch below.
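A quick check from the OS of the application server (installation steps vary by Linux distribution; follow https://cloud.google.com/sdk/docs/install for your OS):

    # Verify that the gcloud CLI is installed and on the PATH
    gcloud -v

    # Confirm which account and project the CLI sees from this VM
    gcloud config list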

  • Cloud API access scopes for the SAP application VM: These can be granted in two ways: either full access to all Cloud APIs, or access to only the BigQuery and Cloud Platform APIs. The scopes must be enabled on every application server VM of the SAP system; a sketch follows below.
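A hedged sketch of checking and restricting the scopes with gcloud; the instance name s4hana-app-vm, the zone, and the service account e-mail are placeholders, and the VM must be stopped before its scopes can be changed:

    # Inspect the service account and access scopes attached to the SAP VM
    gcloud compute instances describe s4hana-app-vm --zone=us-central1-a \
        --format="yaml(serviceAccounts)"

    # Restrict the scopes to BigQuery and Cloud Platform (VM must be stopped)
    gcloud compute instances stop s4hana-app-vm --zone=us-central1-a
    gcloud compute instances set-service-account s4hana-app-vm --zone=us-central1-a \
        --service-account=sap-vm-sa@sap-project.iam.gserviceaccount.com \
        --scopes=https://www.googleapis.com/auth/bigquery,https://www.googleapis.com/auth/cloud-platform
    gcloud compute instances start s4hana-app-vm --zone=us-central1-a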

  • Enable the host VM to obtain access tokens: The following roles should be granted to the service account associated with the SAP application server VM, for example as sketched below the list of roles.

Service Account Token Creator, BigQuery Data Editor, BigQuery Job User
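As a sketch, the token-creator role can be granted on the service account itself (the two BigQuery roles are granted in the target project in section 3.2 below); the service account e-mail is a placeholder:

    # Allow the VM's service account to create access tokens for itself
    gcloud iam service-accounts add-iam-policy-binding \
        sap-vm-sa@sap-project.iam.gserviceaccount.com \
        --member="serviceAccount:sap-vm-sa@sap-project.iam.gserviceaccount.com" \
        --role="roles/iam.serviceAccountTokenCreator"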

  • Configure the firewall to allow access to Google Cloud APIs: Allow outbound access to the API endpoints (https://bigquery.googleapis.com, https://iamcredentials.googleapis.com) that will be used for the RFC connections between SAP and GBQ; a quick connectivity check follows below.
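Reachability can be verified from the VM with a simple HTTPS probe (any HTTP status code in the response means the endpoint is reachable; a timeout suggests a firewall or proxy is blocking port 443):

    # Verify outbound HTTPS connectivity to the Google API endpoints
    curl -s -o /dev/null -w "%{http_code}\n" https://bigquery.googleapis.com
    curl -s -o /dev/null -w "%{http_code}\n" https://iamcredentials.googleapis.com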

3.2 GCP Infrastructure Setup

As this use case considers an SAP RISE system hosted on GCP, there are two steps to perform in the customer's GCP environment to allow data transfer.

  • Create a BigQuery dataset in GBQ: As this process varies across organizations, create a BigQuery dataset in your environment and note the project name and dataset name; this information will be used later in the SLT configuration.
  • Allow the SAP VM service account access in the customer GCP project: In this use case, the GBQ dataset is created in a separate project from the SAP VM. The GCP admin of the project containing the GBQ dataset should grant access to the service account of the SAP VM so it can write data into the dataset. A sketch of both steps follows below.
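A minimal sketch with the bq and gcloud CLIs; the project IDs (bq-project), dataset name (SAP_REPLICATION), location, and service account e-mail are placeholders for illustration:

    # Create the target dataset in the BigQuery project
    bq mk --location=US --dataset bq-project:SAP_REPLICATION

    # Grant the SAP VM's service account write and job access in that project
    gcloud projects add-iam-policy-binding bq-project \
        --member="serviceAccount:sap-vm-sa@sap-project.iam.gserviceaccount.com" \
        --role="roles/bigquery.dataEditor"
    gcloud projects add-iam-policy-binding bq-project \
        --member="serviceAccount:sap-vm-sa@sap-project.iam.gserviceaccount.com" \
        --role="roles/bigquery.jobUser"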

4. Configuration Setup

As the guide linked above covers the detailed instructions, this blog highlights only the key configuration steps in SAP. Feel free to ask questions in the comment section of the blog for additional information.

  • Set up SSL certificates in STRUST: Download the relevant Google root certificates (GTS Root R1, GTS CA 1C3) and import them into the SSL Client PSE of SAP.
  • Specify access settings in /GOOG/CLIENT_KEY: The following values are used in this setup.

Service Account Name: the service account associated with the SAP application server VM

Scope: https://www.googleapis.com/auth/bigquery can be used if the API access scopes were restricted in the earlier step

Project ID: the ID of the target GCP project that contains the GBQ dataset
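Putting these together, a sample /GOOG/CLIENT_KEY entry (maintained via SM30) might look as follows; the key name and all values are placeholders, and the authorization class shown is the one shipped with the connector (verify against the guide for your connector version):

    Google Cloud Key Name : BQ_CLIENT_KEY
    Service Account Name  : sap-vm-sa@sap-project.iam.gserviceaccount.com
    Scope                 : https://www.googleapis.com/auth/bigquery
    Project ID            : bq-project
    Authorization Class   : /GOOG/CL_AUTH_GOOGLE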

  • Create the RFC destinations in SM59: Create two Type G destinations for the Google Cloud APIs and one Type 3 destination for SAP RFC (own system) for data transfer; a sketch of the Type G destinations follows below.
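The two Type G destinations could look like this; the destination names are examples, and the path prefixes follow the connector guide:

    GOOG_BIGQUERY (Type G)
      Target Host : bigquery.googleapis.com
      Service No. : 443
      Path Prefix : /bigquery/v2/
      SSL         : Active, using the client PSE maintained in STRUST

    GOOG_IAMCREDENTIALS (Type G)
      Target Host : iamcredentials.googleapis.com
      Service No. : 443
      Path Prefix : /v1/
      SSL         : Active, using the client PSE maintained in STRUST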

  • Create the SLT configuration using LTRC and note the mass transfer ID it generates; the ID is needed in the next step.

  • Configure /GOOG/SLT_SETTINGS: Configure a mass transfer for BigQuery and specify the table and field mappings, for example as sketched after the field descriptions below.

Project Identifier: the ID of the target GCP project

BQ Dataset: the name of the GBQ dataset
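For example, a mass transfer entry could look as follows; the mass transfer ID comes from the LTRC configuration, and all names are placeholders:

    Mass Transfer ID   : 001
    Project Identifier : bq-project
    BQ Dataset         : SAP_REPLICATION
    Table              : MARA (default field mapping)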

5. Test Replication

After the above setup, test the replication using LTRC. All other relevant SLT configuration options (for example, the advanced settings in LTRS) can be used in this scenario as well. Once a table is loading or replicating, the arrival of data can be verified on the BigQuery side, as sketched below.
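A quick row-count check against BigQuery, reusing the placeholder project, dataset, and table names from above:

    # Count rows in the replicated table on the BigQuery side
    bq query --use_legacy_sql=false \
        'SELECT COUNT(*) FROM `bq-project.SAP_REPLICATION.MARA`'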

6. Considerations

  • This setup is intended for replication where the data transfer volume is not large, as the embedded SLT scenario adds load to the SAP system itself.
  • Check with the SAP ECS team regarding the security approvals required for the changes to SAP VM access.
  • For larger tables such as ACDOCA, advanced replication performance tuning should be considered using the Performance Optimization Guide for SLT.
  • As this use case uses embedded SLT, certain performance optimization options will increase the database size through the temporary logging tables.
  • Periodically run program CNV_NOTE_ANALYZER_SLT to check for the latest SAP Notes, which can be applied to improve SLT and fix known bugs.