
Auto Scaling of SAP Systems on Azure – Part I

In this article we will discuss the auto-scaling solution on Azure and how to set it up for SAP systems hosted on Azure.

Organizations running their SAP workloads on Azure can take advantage of this solution, which allows them to make full use of cloud scalability for their SAP systems.

This article concentrates mostly on the configuration of the auto-scaling solution for SAP systems hosted on Azure infrastructure.

1. AZURE SERVICES

1.1 Azure Monitor

Azure Monitor helps maximize the availability and performance of SAP applications and services. It delivers a comprehensive solution for collecting, analyzing, and acting on telemetry from cloud and on-premises environments. Here we will be using Azure Monitor to raise alerts from the telemetry data collected in the Log Analytics workspace.

1.2 Log Analytics Workspace

A Log Analytics workspace is a unique environment for Azure Monitor log data. Each workspace has its own data repository and configuration, and data sources and solutions are configured to store their data in a workspace. Telemetry data for the SAP workload will be collected here and later used to drive the scaling actions.
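As a quick illustration, the collected data can be queried from PowerShell with the Az.OperationalInsights module. This is a minimal sketch only; the workspace ID, the custom table name SAPPerf_CL, and the column WpUtilization_d are placeholder assumptions for whatever the logic app actually writes (Azure appends _CL to custom tables and _d to numeric custom fields):

# Minimal sketch: query the custom log table that holds the SAP performance data.
# Table, column, and workspace ID below are assumptions for illustration.
$query  = "SAPPerf_CL | where TimeGenerated > ago(15m) | summarize avg(WpUtilization_d)"
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId "<workspace-guid>" -Query $query
$result.Results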

1.3 On-premises Data Gateway

The on-premises data gateway acts as a bridge: it provides quick and secure data transfer between on-premises data (data that isn't in the cloud) and several Microsoft cloud services, including Power BI, Power Apps, Power Automate, Azure Analysis Services, and Azure Logic Apps. By using a gateway, we can keep databases and other data sources on their on-premises networks while securely using that on-premises data in cloud services. Here we use the on-premises data gateway so that Azure Logic Apps can connect directly to the SAP system.

1.4 Key Vault

Azure Key Vault is a cloud service for securely storing and accessing secrets. A secret is anything to which we want to tightly control access, such as API keys, passwords, certificates, or cryptographic keys. Vaults support software and HSM-backed keys, secrets, and certificates. All the secrets for the SAP systems will be stored in the Key Vault and used by the providers to access the SAP systems when collecting data.

1.5 Managed Identities

Managed identities eliminate the need for services to manage credentials. They provide an identity for applications to use when connecting to resources that support Azure Active Directory (Azure AD) authentication, and applications may use the managed identity to obtain Azure AD tokens. Here, managed identities allow the providers to use Azure AD authentication to retrieve the SAP system secrets from the Key Vault.
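As a minimal sketch of how Key Vault and a managed identity work together, a script running under the managed identity can sign in without any stored credentials and then read an SAP secret; the vault and secret names below are placeholder assumptions:

# Sign in with the managed identity assigned to this Azure resource (no credentials)
Connect-AzAccount -Identity

# Retrieve an SAP system secret from Key Vault (vault/secret names are assumptions)
$sapPassword = Get-AzKeyVaultSecret -VaultName "kv-sap-autoscale" `
    -Name "S4H-communication-user" -AsPlainText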

1.6 Shared Image Galleries

Using a Shared Image Gallery, we can share our VM images with different users, service principals, or AD groups within and outside our organization. Shared images can be replicated to multiple regions for quicker scaling of our deployments. Here we use a custom VM image from the gallery to scale the SAP system out on demand.
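For illustration, the runbook can look up the latest version of the custom application server image in the gallery before handing its resource ID to the ARM deployment; the resource group, gallery, and image definition names below are assumptions:

# Minimal sketch: pick the most recently published version of the custom SAP image.
# All resource names are placeholder assumptions.
$image = Get-AzGalleryImageVersion -ResourceGroupName "rg-sap-images" `
    -GalleryName "sigSap" -GalleryImageDefinitionName "sap-app-server" |
    Sort-Object { $_.PublishingProfile.PublishedDate } | Select-Object -Last 1
$image.Id   # referenced by the ARM template as the source image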

1.7 Azure Virtual Machine

Azure Virtual Machines (VMs) are one of several types of on-demand, scalable computing resources that Azure offers. Here, VMs will be deployed by the Automation runbook based on the telemetry data collected from the SAP system.

1.8 Azure Storage Account

An Azure storage account contains all our Azure Storage data objects: blobs, file shares, queues, tables, and disks. The storage account provides a unique namespace for our Azure Storage data that is accessible from anywhere in the world over HTTP or HTTPS. Data in our storage account is durable and highly available, secure, and massively scalable. Here we use the blob container and table services of the storage account to store the custom scripts and the scaling configuration settings, respectively.
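For example, the scaling configuration row can be read with the community AzTable module; the storage account, table, and key names below are placeholder assumptions:

# Minimal sketch, assuming the AzTable module and placeholder names.
$ctx   = New-AzStorageContext -StorageAccountName "stsapautoscale" -UseConnectedAccount
$table = (Get-AzStorageTable -Name "ScalingConfig" -Context $ctx).CloudTable

# Read the configuration row for one SAP system (partition/row keys are assumptions)
$config = Get-AzTableRow -Table $table -PartitionKey "S4H" -RowKey "appservers"
$config.CurrentAppServerCount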

1.9 Automation Runbook

Azure Automation is a service in Azure that allows us to automate our Azure management tasks and to orchestrate actions across external systems from right within Azure. An Automation account is a container for all our runbooks, runbook executions (jobs), and the assets that our runbooks depend on. Here we will be using Automation runbooks to run the PowerShell scripts that perform the Azure operations.

1.10 Azure Logic Apps

Azure Logic Apps is a cloud-based platform for creating and running automated workflows that integrate our apps, data, services, and systems. With this platform, we can quickly develop highly scalable integration solutions for enterprise and business-to-business (B2B) scenarios. As a member of Azure Integration Services, Logic Apps simplifies the way we connect legacy, modern, and cutting-edge systems across cloud, on-premises, and hybrid environments. Here we will be using logic apps to connect to the SAP system, both to collect telemetry data and to change the logon and server groups.

1.11 Integration Service Environment

An integration service environment (ISE) is a fully isolated and dedicated environment for all enterprise-scale integration needs. When we create a new integration service environment, it is injected into our Azure virtual network, allowing us to deploy logic apps as a service inside our VNet. We will be using the ISE so that the logic apps can access the SAP system over the private network.

1.12 SAP Connector for Microsoft .NET

SAP Connector for Microsoft .NET 3.0 (NCo 3.0) allows developers to use BAPIs and remote-enabled function modules in any .NET application (inside-out). We can also access .NET components from any ABAP application by implementing an RFC server in .NET (outside-in). Here we will install this connector on the gateway server so that the logic app can connect to the SAP system through the gateway.

2. ARCHITECTURE OVERVIEW

The following diagram shows the services that are integrated to achieve auto-scaling of SAP systems on Azure infrastructure.

2.1 Auto Scaling Out of SAP Systems

Below is the architectural diagram for scaling out SAP systems, with an overview of the operational flow:

  • A logic app deployed in the ISE runs periodically on a recurrence trigger, pulls performance data from SAP through the OData service URL using a time filter, and pushes it into the Log Analytics workspace.
  • An Azure Monitor log alert queries the custom log table in the Log Analytics workspace that holds the SAP performance data and fires when a specific threshold is breached.
  • The alert triggers the Azure Automation PowerShell runbook. The runbook checks the scaling configuration table to see whether the current application server count already equals the maximum; if so, it exits without performing any action.
  • If not, it triggers an ARM template deployment to create a new application server from the custom VM image.
  • After the application server has been created successfully, the runbook calls a logic app to register the new application server in the SAP logon and server groups, as per the information in the configuration table.
  • Finally, the runbook updates the current application server count in the scaling configuration table for the next execution (a sketch of such a runbook follows this list).
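The following is a condensed, illustrative sketch of such a scale-out runbook. All resource names, table columns, template parameters, and the HTTP call to the logic app are assumptions, not the exact implementation:

# Illustrative scale-out runbook sketch; all names are placeholder assumptions.
Connect-AzAccount -Identity

# 1. Read the scaling configuration (current/max counts) from the config table
$ctx    = New-AzStorageContext -StorageAccountName "stsapautoscale" -UseConnectedAccount
$table  = (Get-AzStorageTable -Name "ScalingConfig" -Context $ctx).CloudTable
$config = Get-AzTableRow -Table $table -PartitionKey "S4H" -RowKey "appservers"

# 2. Exit if we are already at the maximum application server count
if ($config.CurrentAppServerCount -ge $config.MaxAppServerCount) { return }

# 3. Deploy a new application server from the custom image via the ARM template
$newCount = $config.CurrentAppServerCount + 1
$vmName   = "sapapp{0:D2}" -f $newCount
New-AzResourceGroupDeployment -ResourceGroupName "rg-sap-app" `
    -TemplateUri $config.TemplateUri -vmName $vmName   # vmName: a template parameter

# 4. Call the logic app (HTTP trigger) that registers the server in the
#    SAP logon and server groups
Invoke-RestMethod -Method Post -Uri $config.RegisterLogicAppUrl `
    -Body (@{ server = $vmName } | ConvertTo-Json) -ContentType "application/json"

# 5. Persist the new current count for the next execution
$config.CurrentAppServerCount = $newCount
$config | Update-AzTableRow -Table $table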

2.2 Auto Scaling Down of SAP Systems

Below is the architectural diagram for scaling down SAP systems, with an overview of the operational flow:

  • A logic app deployed in the ISE runs periodically on a recurrence trigger, pulls performance data from SAP through the OData service URL using a time filter, and pushes it into the Log Analytics workspace.
  • An Azure Monitor log alert queries the custom log table in the Log Analytics workspace that holds the SAP performance data and fires when a specific threshold is breached.
  • The alert triggers the Azure Automation PowerShell runbook. The runbook checks the scaling configuration table to see whether the current application server count already equals the minimum; if so, it exits without performing any action.
  • If not, the runbook calls a logic app to un-register the additional application server from the logon and server groups, as per the information in the configuration table.
  • The runbook then schedules a second runbook based on the delay timeout defined in the configuration table. This allows existing user sessions and jobs to drain from the application server before it is stopped and deleted.
  • The second runbook issues a soft shutdown command on the application server, with a timeout parameter fetched from the configuration table.
  • Once the SAP application server has stopped successfully, the respective Azure resources such as the VM, NIC, and disks are deleted in the same execution.
  • Finally, the runbook updates the current application server count in the scaling configuration table for the next execution (a sketch follows this list).
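A matching illustrative sketch of the scale-down flow is shown below, collapsed into a single script for brevity (the real solution splits the drain delay and the stop/delete into a second scheduled runbook). Resource names, table columns, and the shutdown command are assumptions:

# Illustrative scale-down sketch; all names are placeholder assumptions.
Connect-AzAccount -Identity

$ctx    = New-AzStorageContext -StorageAccountName "stsapautoscale" -UseConnectedAccount
$table  = (Get-AzStorageTable -Name "ScalingConfig" -Context $ctx).CloudTable
$config = Get-AzTableRow -Table $table -PartitionKey "S4H" -RowKey "appservers"

# Exit if we are already at the minimum application server count
if ($config.CurrentAppServerCount -le $config.MinAppServerCount) { return }
$vmName = "sapapp{0:D2}" -f $config.CurrentAppServerCount

# Un-register the server from the SAP logon/server groups via the logic app
Invoke-RestMethod -Method Post -Uri $config.UnregisterLogicAppUrl `
    -Body (@{ server = $vmName } | ConvertTo-Json) -ContentType "application/json"

# Wait for user sessions/jobs to drain (the real solution schedules a
# second runbook after this delay instead of sleeping)
Start-Sleep -Seconds $config.DrainDelaySeconds

# Soft-stop the SAP instance on the VM (sapcontrol syntax is indicative only)
Invoke-AzVMRunCommand -ResourceGroupName "rg-sap-app" -VMName $vmName `
    -CommandId "RunShellScript" -ScriptString "sapcontrol -nr 00 -function Stop"

# Delete the VM; the full solution also removes the NIC and disks
# (Remove-AzNetworkInterface / Remove-AzDisk) before updating the count
Remove-AzVM -ResourceGroupName "rg-sap-app" -Name $vmName -Force
$config.CurrentAppServerCount = $config.CurrentAppServerCount - 1
$config | Update-AzTableRow -Table $table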

3. PREREQUISITES

To take advantage of the above architecture, the following prerequisites must be met.

3.1 Authorization

The configuring user must have access to the following:

  • Read performance data from the SAP system
  • Get secrets from Azure Key Vault
  • Create managed identities, Azure storage accounts, Log Analytics workspaces, the ISE for Logic Apps, Azure Monitor alerts, and data gateways
  • Create and use ARM templates to deploy VMs
  • Read and write access to the storage account
  • Full access to Azure Automation and runbooks

3.2 Prepare SAP for Monitoring

3.2.1 Start Snapshot Monitoring

To accurately measure the load on an SAP application server, SAP-specific performance metrics such as work process utilization, user sessions, and SAP application memory usage are required. SAP provides a snapshot monitoring utility, SMON (or /SDF/MON), which collects this information and stores it in a transparent header table within the SAP database.

Go to transaction /SDF/MON and click on Schedule Daily Monitoring.

Fill out the settings required for the monitoring records and click Execute.

The following screen appears once the schedule has been saved with the settings we selected.

We can also double-click the saved entry to check that the monitoring statistics are being gathered as expected.

Now our performance data is being collected by the SMON daily scheduler.

3.2.2 SAP Development Package

To make this data available to third-party tools, we need to create some SAP objects through which the data can be read by the external Azure services with proper authentication. To hold these changes we create a custom development package. If a suitable development package already exists, this step can be skipped.

Go to transaction SE80, select Package from the drop-down menu, and search for the package. A pop-up will appear asking whether to create the object; click Yes.

Specify all the details required to create the package and then click the tick (confirm) button.

After the package has been created successfully, we can see the following screen:

Now our package is ready to hold the custom objects used for the auto-scaling telemetry data.

3.2.3 SAP Gateway Service Builder

Since all the data is stored inside the SAP system, we need to create a gateway service through which the Azure services can read the telemetry data with proper authentication.

Go to transaction SEGW and click on Create.

Specify the requested details for the gateway service and click the tick button.

Once the gateway service has been created, the following screen with a success message appears.

Since we need to import the structure from the DDIC object, right-click on Data Model → Import and choose DDIC Structure.

Specify the name of the entity and the name of the ABAP structure from which the data model is to be built, then click Next.

Select the fields of the chosen ABAP structure that are required for evaluating the data; here we choose the parameters according to the auto-scaling configuration. Then click Next.

Specify the fields to be kept as key fields and click Next.

Once the entity type has been imported successfully into the data model, the following screen appears.

Now switch to edit mode in the Gateway Service Builder and click the Generate button to generate all the relevant objects.

Specify the names of the objects to be generated during this process and click the tick button.

After the objects have been generated successfully, they appear with a green status, as below.

Since we want this service to return all the telemetry data collected by the SMON jobs, we need to adjust some ABAP code. Redefine the GET_ENTITYSET method so that it reads the /SDF/MON header table and passes the OData $filter of the request through as the Open SQL WHERE clause:

" Convert the OData $filter of the request into an Open SQL WHERE clause
DATA lv_osql_where_clause TYPE string.
lv_osql_where_clause = io_tech_request_context->get_osql_where_clause( ).

" Read the SMON snapshot records with the dynamic WHERE clause
SELECT * FROM /sdf/mon_header
  WHERE (lv_osql_where_clause)
  INTO CORRESPONDING FIELDS OF TABLE @et_entityset.

3.2.4 Activate and Maintain Services

Now that our gateway service can access the telemetry data, we need to configure and activate it so that external consumers can call it. Go to transaction /IWFND/MAINT_SERVICE and click on Add Service.

Search for the back-end service using the following criteria:

This shows the gateway service we built earlier; click on the service:

Specify and cross-check all the details shown on the next screen and then click the tick button.

The following message pops up once the service has been added successfully:

We can also see the newly added service in the list:

Now our service is available for external tools to read the monitoring data from the SAP system.

3.2.5 Verify with SAP Gateway Client

With all the settings in place, we can use the SAP Gateway Client to verify that the telemetry data can be accessed. Go to transaction /IWFND/GW_CLIENT and execute a GET request using the request URI of the gateway service, as below:
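For example, a request URI of the following form returns the first few monitoring records (the service and entity set names below are placeholders; use the names chosen in SEGW):

/sap/opu/odata/sap/ZSMON_SRV/HeaderSet?$top=10

An HTTP 200 response containing the SMON records confirms that the external tools will be able to read the monitoring data.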

Next Part: Auto Scaling of SAP Systems on Azure – Part II
