
AWS Serverless Lambda Functions to integrate S3 Buckets with SAP S/4 HANA using OData APIs

Introduction

This blog shows how to integrate AWS S3 buckets and AWS Lambda functions with SAP S/4 HANA OData APIs.

For those unfamiliar with AWS S3 and Lambda functions, here are descriptions from the AWS websites:

AWS (Amazon Web Services) Lambda is a serverless, event-driven service that allows you to execute application logic dynamically without the need for dedicated servers. Lambda functions can be triggered from most AWS services, and you only pay for what you use.

AWS (Amazon Web Services) Simple Storage Service (Amazon S3) is one of the leading cloud-based object storage solutions and can be used for data of all sizes, from databases to data lakes and IoT. Its scalability, reliability, simplicity, and flexibility, backed by full API functionality, are state-of-the-art.

Architecture

Here is an architecture diagram for the prototype. Normally, an SAP S/4 HANA system would be behind a firewall, and there would be another layer to safely expose its services. To keep the prototype simple, we focus the scenario on AWS S3, AWS Lambda, and the SAP S/4 HANA OData API call.

AWS S3 Bucket triggering Lambda function to SAP OData API call

Background

I was having an architecture discussion last week with Jesse, a good friend of mine with whom I used to work at SAP Labs in Palo Alto. One of the things I have really enjoyed over the years is talking to him about new technologies and how to leverage them in the context of SAP. We have implemented many cutting-edge SAP integration projects and witnessed the evolution of SAP integration, starting from the lowest level of C, C++, Java, and Microsoft COM/.NET components in the RFC SDK, through Business Connector, BizTalk integration, XI, PI, PO, and CPI, to the latest SAP BTP Cloud Integration with the SAP API Business Hub and SAP API Portal.

My friend mentioned an interesting scenario where he built serverless Lambda functions on AWS that could be triggered on a schedule to run his logic, which in his case checked the price of a certain item on a website and triggered an alert when the price reached a certain threshold – similar to the Kayak application, which checks prices across Expedia, Travelocity, and others. No need for a dedicated server… no need to build or maintain a server… just the logic code run on demand… What a powerful and amazing concept!

I immediately started thinking of all the possibilities and how this could be used for an SAP-focused integration scenario. Let’s drop a sales order file into an AWS S3 bucket and have it immediately trigger an AWS Lambda function, written in Node.js, that invokes an SAP sales order OData RESTful service and drops the response into another AWS S3 bucket. You can picture this as the evolution of the traditional integration scenario, whereby a file is dropped on an SFTP server, a middleware server regularly polls the folder for new files, and the backend SAP system is then called through a BAPI, a custom RFC, or the modern OData approach. We get away from antiquated SFTP servers and use the more versatile, flexible, and powerful S3 bucket technology. We get away from the older SAP BAPIs (BAPI_SALESORDER_CREATEFROMDAT2) and move to the latest SAP OData API. Evolution…

Getting started with the fully activated SAP S/4 HANA 2021 appliance

So I convinced Roland, another SAP colleague with whom I used to work at SAP Labs in Palo Alto, to get this scenario up and running. We decided to spin up a new trial SAP S/4 HANA 2021 full appliance on the AWS Cloud through the SAP Cloud Appliance Library and start the adventure.

SAP S/4 HANA 2021 FPS02 Fully Activated Appliance on SAP CAL

SAP Cloud Appliance Library

If you have access to an SAP S/4 HANA system that you can call from the internet (through SAP BTP Cloud Integration, SAP API Hub, or a reverse proxy that exposes your cloud or on-premise SAP S/4 HANA system), there is no need to spin up an appliance. If you do not have an SAP S/4 HANA system, then definitely spin one up and follow this blog to implement the integration scenario. It is easy to spin up a test SAP S/4 HANA fully activated appliance, and it only takes a matter of minutes. Getting an SAP system up and running for prototyping used to take weeks of planning the hardware, purchasing the SAP software, installing, patching, and configuring. Now it takes minutes with the deployment capabilities of AWS, Azure, and Google.

The new images on SAP’s Cloud Appliance Library are really impressive: the SAP Fiori launchpad runs correctly right away, and on the remote desktop machine even the Eclipse installation can be triggered with a few mouse clicks, which loads the SAP ADT (ABAP Development Tools). All of the SAP OData services and SAP Fiori apps are enabled by default, which is very helpful! I have deployed many of these test appliances since this capability was introduced, but it still used to take time to get set up for prototype development. Not anymore.

Here you can see the SAP S/4 HANA instance and the remote desktop instance running. I shut the SAP Business Objects BI Platform and SAP NetWeaver instances down since they are not needed for this demo.

SAP S/4 HANA instances running on AWS Cloud

SAP Sales Order JSON Request and SAP OData API Service

Here is the SAP sales order request JSON that we drop into the AWS S3 bucket. Note that this JSON request works for invoking the standard SAP sales order service API_SALES_ORDER_SRV to create a sales order in the fully activated appliance. For this project, we do not want to add complexity through mapping requirements, although in most common integration scenarios mapping may be required from other formats such as cXML, OAG XML, etc. Also, in a typical landscape the SAP S/4 HANA system is not exposed directly but through SAP BTP Cloud Integration or SAP API Hub, which adds a layer of security control over the interfaces. For now, we just expose the API over HTTPS to the Lambda function.

{
   "SalesOrderType":"OR",
   "SalesOrganization":"1010",
   "DistributionChannel":"10",
   "OrganizationDivision":"00",
   "SalesGroup":"",
   "SalesOffice":"",
   "SalesDistrict":"",
   "SoldToParty":"10100011",
   "PurchaseOrderByCustomer":"File dropped into S3 Bucket",
   "to_Item":[
      {
         "SalesOrderItem":"10",
         "Material":"TG11",
         "RequestedQuantity":"1"
      },
      {
         "SalesOrderItem":"20",
         "Material":"TG12",
         "RequestedQuantity":"5"
      }
   ]
}

To verify that the SAP sales order service is running, run transaction code /IWFND/MAINT_SERVICE and check that the API_SALES_ORDER_SRV (Sales Order A2X) service highlighted below is active. If it is not, activate it through this transaction code. You can also verify reachability from outside the SAP GUI, as sketched below.
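
For that check, here is a minimal Node.js sketch that requests the service metadata over HTTPS; the host, port, and credentials are placeholders you must replace with your own appliance values:

// Hypothetical reachability check for the sales order OData service.
// Replace hostname, port, and credentials with your appliance's values.
const https = require('https')

https.get({
  hostname: 'your-s4-host',   // placeholder host
  port: 44301,                // placeholder HTTPS port
  path: '/sap/opu/odata/sap/API_SALES_ORDER_SRV/$metadata',
  auth: 'USER:PASSWORD',      // placeholder credentials
  rejectUnauthorized: false,  // the appliance uses a self-signed certificate
}, (res) => {
  // 200 means the service is active and reachable; 403 or 404 usually
  // means it still needs to be activated in /IWFND/MAINT_SERVICE
  console.log('status:', res.statusCode)
}).on('error', console.error)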

SAP Sales order service – API_SALES_ORDER_SRV

AWS S3 Bucket for uploading the sales order request file

Here is our ltc-inbound S3 bucket, to which we will upload the file:

S3 Bucket ltc-inbound

When we upload the file, it automatically triggers the Lambda function. We will show how to set up this trigger later in the document:

File uploaded to the S3 Bucket
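
As an alternative to uploading through the console, a small script using the AWS SDK can drop the file into the bucket and fire the same trigger. This is a minimal sketch, assuming the request JSON is saved locally as salesorder.json (a hypothetical file name):

// Minimal sketch: upload the sales order request JSON to the inbound
// bucket; the resulting PutObject event triggers the Lambda function.
const aws = require('aws-sdk')
const fs = require('fs')

const s3 = new aws.S3({ apiVersion: '2006-03-01' })

s3.putObject({
  Bucket: 'ltc-inbound',
  Key: 'salesorder.json',                      // hypothetical object key
  Body: fs.readFileSync('./salesorder.json'),  // the request JSON shown above
}, (err, data) => {
  if (err) console.error(err)
  else console.log('uploaded:', data.ETag)
})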

AWS CloudWatch Monitoring View

Once the Lambda function is triggered, you can see the log in the AWS CloudWatch monitoring tool:

AWS CloudWatch Lambda function execution log

If you open up the log details, you can see the logging we performed from our Node.js code: “writing sales order 4969 to S3 bucket ltc-outbound”.

Lambda function detailed logs on AWS CloudWatch

AWS S3 Bucket for writing the sales order response file

In our Lambda function, we save the sales order response file in the ltc-outbound S3 bucket. Note that we have parsed out the order number and used it to name the file, making it easier to pick out among other files.

Sales Order Response saved to AWS S3 Bucket

Sales Order response JSON file

Here is the response sales order JSON:

{
    "d":
    {
        "__metadata":
        {
            "id": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')",
            "uri": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')",
            "type": "API_SALES_ORDER_SRV.A_SalesOrderType",
            "etag": "W/\"datetimeoffset'2022-10-24T06%3A05%3A25.7050130Z'\""
        },
        "SalesOrder": "4969",
        "SalesOrderType": "OR",
        "SalesOrganization": "1010",
        "DistributionChannel": "10",
        "OrganizationDivision": "00",
        "SalesGroup": "",
        "SalesOffice": "",
        "SalesDistrict": "",
        "SoldToParty": "10100011",
        "CreationDate": null,
        "CreatedByUser": "",
        "LastChangeDate": null,
        "SenderBusinessSystemName": "",
        "ExternalDocumentID": "",
        "LastChangeDateTime": "/Date(1666591525705+0000)/",
        "ExternalDocLastChangeDateTime": null,
        "PurchaseOrderByCustomer": "File dropped into S3 J1",
        "PurchaseOrderByShipToParty": "",
        "CustomerPurchaseOrderType": "",
        "CustomerPurchaseOrderDate": null,
        "SalesOrderDate": "/Date(1666569600000)/",
        "TotalNetAmount": "105.30",
        "OverallDeliveryStatus": "",
        "TotalBlockStatus": "",
        "OverallOrdReltdBillgStatus": "",
        "OverallSDDocReferenceStatus": "",
        "TransactionCurrency": "EUR",
        "SDDocumentReason": "",
        "PricingDate": "/Date(1666569600000)/",
        "PriceDetnExchangeRate": "1.00000",
        "RequestedDeliveryDate": "/Date(1666569600000)/",
        "ShippingCondition": "01",
        "CompleteDeliveryIsDefined": false,
        "ShippingType": "",
        "HeaderBillingBlockReason": "",
        "DeliveryBlockReason": "",
        "DeliveryDateTypeRule": "",
        "IncotermsClassification": "EXW",
        "IncotermsTransferLocation": "Walldorf",
        "IncotermsLocation1": "Walldorf",
        "IncotermsLocation2": "",
        "IncotermsVersion": "",
        "CustomerPriceGroup": "",
        "PriceListType": "",
        "CustomerPaymentTerms": "0001",
        "PaymentMethod": "",
        "FixedValueDate": null,
        "AssignmentReference": "",
        "ReferenceSDDocument": "",
        "ReferenceSDDocumentCategory": "",
        "AccountingDocExternalReference": "",
        "CustomerAccountAssignmentGroup": "01",
        "AccountingExchangeRate": "0.00000",
        "CustomerGroup": "01",
        "AdditionalCustomerGroup1": "",
        "AdditionalCustomerGroup2": "",
        "AdditionalCustomerGroup3": "",
        "AdditionalCustomerGroup4": "",
        "AdditionalCustomerGroup5": "",
        "SlsDocIsRlvtForProofOfDeliv": false,
        "CustomerTaxClassification1": "",
        "CustomerTaxClassification2": "",
        "CustomerTaxClassification3": "",
        "CustomerTaxClassification4": "",
        "CustomerTaxClassification5": "",
        "CustomerTaxClassification6": "",
        "CustomerTaxClassification7": "",
        "CustomerTaxClassification8": "",
        "CustomerTaxClassification9": "",
        "TaxDepartureCountry": "",
        "VATRegistrationCountry": "",
        "SalesOrderApprovalReason": "",
        "SalesDocApprovalStatus": "",
        "OverallSDProcessStatus": "",
        "TotalCreditCheckStatus": "",
        "OverallTotalDeliveryStatus": "",
        "OverallSDDocumentRejectionSts": "",
        "BillingDocumentDate": "/Date(1666569600000)/",
        "ContractAccount": "",
        "AdditionalValueDays": "0",
        "CustomerPurchaseOrderSuplmnt": "",
        "ServicesRenderedDate": null,
        "to_Item":
        {
            "results":
            []
        },
        "to_Partner":
        {
            "__deferred":
            {
                "uri": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')/to_Partner"
            }
        },
        "to_PaymentPlanItemDetails":
        {
            "__deferred":
            {
                "uri": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')/to_PaymentPlanItemDetails"
            }
        },
        "to_PricingElement":
        {
            "__deferred":
            {
                "uri": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')/to_PricingElement"
            }
        },
        "to_RelatedObject":
        {
            "__deferred":
            {
                "uri": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')/to_RelatedObject"
            }
        },
        "to_Text":
        {
            "__deferred":
            {
                "uri": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')/to_Text"
            }
        }
    }
}


Sales Order in SAP S/4 HANA and transaction codes

Here is the list of orders from the SAP VBAK sales order header table, displayed through transaction code SE16N. Note the order number 4969, captured in our AWS CloudWatch logs above.

Table entries in SAP table VBAK – sales order header table

We can bring up the sales order in SAP transaction code VA03:

Sales order transaction VA03

Here is the sales order – note the customer reference:

SAP Transaction VA03 – Sales order display

Lambda function details

Here are the details of the Lambda function.

AWS Lambda Dashboard

Click on the Create function button:

Choose the Use a blueprint option to get the sample code, and choose the Get S3 Object blueprint:

Use the Blueprint to Get S3 Object
Blueprint Get S3 Object

Note that we also need to create a role – processOrderRole – which needs permissions to read the S3 bucket file content:

processOrder Lambda function configuration

Here is the S3 trigger that invokes the Lambda function:

S3 trigger
Additional configuration for the lambda function
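
If you prefer to script the trigger rather than click through the console, the same notification can be expressed as an S3 bucket notification configuration. Here is a sketch in the JSON shape accepted by the aws s3api put-bucket-notification-configuration command; the region and account placeholders must be replaced with your own values:

{
  "LambdaFunctionConfigurations": [
    {
      "Id": "invoke-processOrder-on-upload",
      "LambdaFunctionArn": "arn:aws:lambda:<region>:<account-id>:function:processOrder",
      "Events": ["s3:ObjectCreated:*"]
    }
  ]
}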

Here is the auto-generated Node.js code that is triggered. This code receives the event, from which you can read the S3 bucket name and the S3 object key, which you can then use to fetch the file contents.

Generated Lambda Node.js code
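
For reference, here is an abridged sketch of the S3 event the handler receives (most fields trimmed; the object key is a hypothetical file name). The bucket name and object key read by the code come from the first record:

{
  "Records": [
    {
      "eventSource": "aws:s3",
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": { "name": "ltc-inbound" },
        "object": { "key": "salesorder.json" }
      }
    }
  ]
}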

Here is our Node.js code, which takes the input file and then makes the OData RESTful call to the SAP sales order service endpoint. A challenging part we encountered was making the call to the SAP system, which has a self-signed certificate. Also, note that since we are doing an HTTP POST to create the sales order, we need to pass in an x-csrf-token. We get the token from a GET request for the service metadata, which is the first call to the system before the POST call.

Here is the Lambda function Node.js code, which you can reuse:

// Author: Roland & Jay
// Note: This code requires the request and lodash npm modules
// Description: This code uses the AWS S3 events and objects and calls the SAP S/4 HANA sales order
// OData API service

console.log('Loading function')

const aws = require('aws-sdk')
const request = require('request')
const {get} = require('lodash')

const s3 = new aws.S3({ apiVersion: '2006-03-01' })


exports.handler = async (event, context) => {
  //console.log('Received event:', JSON.stringify(event, null, 2))

  // Get the object from the event and show its content type
  const bucket = event.Records[0].s3.bucket.name
  const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '))
  const params = {
    Bucket: bucket,
    Key: key,
  }
  try {
    const obj = await s3.getObject(params).promise()
    const { ContentType, Body } = obj
    const body = JSON.parse(Body.toString())
    await processOrder(body)
    return ContentType
  } catch (err) {
    console.log(err)
    const message = `Error getting object ${key} from bucket ${bucket}. Make sure they exist and your bucket is in the same region as this function.`
    console.log(message)
    throw new Error(message)
  }
}


// SAP S/4 HANA connection details (masked), OData service path,
// credentials, and the outbound S3 bucket
const hostname = '**.**.***.*'
const port = *****
const interface = 'sap/opu/odata/sap/API_SALES_ORDER_SRV'
const auth = {
  user: '*********',
  pass: '**********',
}
const bucket = 'ltc-outbound'

// Builds the options for the request call: rejectUnauthorized: false accepts
// the appliance's self-signed certificate, and the default 'fetch' token
// value asks SAP to return a CSRF token on the initial GET
const buildCallParameters = (url, request, method = 'GET', extraHeaders = {}, jar = request.jar(), token = 'fetch', body) => {
  console.log('build', method)
  const params = {
    url,
    jar,
    method,
    rejectUnauthorized: false,
    requestCert: true,
    agent: false,
    auth,
    headers: {
      'x-csrf-token': token,
      ...extraHeaders,
    },
  }
  return !body ? params : { ...params, body, json: true }
}

// Wraps the callback-style request call in a promise
const httpCall = async (url, request, method = 'GET', extraHeaders = {}, jar, token, body) => {
  return new Promise((resolve, reject) => {
    const params = buildCallParameters(url, request, method, extraHeaders, jar, token, body)
    request(params,
      (error, response) => {
        if (error) {
          return reject(error)
        }
        return resolve(response)
      })
  })
}

// First GET the service metadata to obtain the x-csrf-token (the cookie jar
// keeps the session), then POST the sales order JSON with that token
const postDataToSAP = async function (json, metaDataUrl, postUrl) {
  const jar = request.jar()
  const tokenResp = await httpCall(metaDataUrl, request, 'GET', {}, jar)
  const token = tokenResp.headers['x-csrf-token']
  console.log('token: ', token)
  const postResp = await httpCall(
    postUrl,
    request,
    'POST',
    { 'Content-Type': 'application/json' },
    jar,
    token,
    json,
  )
  return postResp
}

// Posts the order to SAP, extracts the created order number from the
// response, and writes the response JSON to the outbound bucket
const processOrder = async (order) => {
  console.log('starting')
  try {
    const { body } = await postDataToSAP(
      order,
      `https://${hostname}:${port}/${interface}/$metadata`,
      `https://${hostname}:${port}/${interface}/A_SalesOrder`,
    )
    console.log('success: ', body)
    const orderNum = get(body, 'd.SalesOrder', 'error')
    console.log(`writing sales order ${orderNum} to S3 bucket ${bucket}`)
    await putObjectToS3(bucket, `${orderNum}.json`, JSON.stringify(body))
  } catch (error) {
console.log('error: ', error)
  }
}


// Promisified s3.putObject used to write the response file
const putObjectToS3 = (bucket, key, data) => {
  const params = {
    Bucket: bucket,
    Key: key,
    Body: data
  }
  return new Promise((resolve, reject) => {
    s3.putObject(params, (err, data) => {
      if (err) {
        return reject(err)
      }
      resolve(data)
    })
  })
}

This code makes backend calls with the request npm module using await, async functions, and promises. Two SAP backend calls are made: the first gets the metadata and the x-csrf-token, and the second, main call creates the sales order. To run this code in Node.js, we need to load the npm libraries for request and lodash into the Lambda project. To do this, create a directory in your filesystem, run the following two commands, then zip up the resulting node_modules folder and upload it to the Lambda function:

npm install request

npm install lodash
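
On macOS or Linux, for example, the archive can then be created with a command along these lines (a sketch; the archive name is your choice):

zip -r node_modules.zip node_modules
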
Upload the node_modules zip file with request and lodash npm modules

Besides loading the code, you need to ensure that the Lambda function has permissions to read and write the files in the S3 buckets. The read permissions were created when creating the Lambda function above.

Here is the role and policy to write out the file to the S3 bucket:

processOrderRole

We need to add a policy to the processOrderRole to allow writing to the S3 bucket; a sketch of such a policy follows the screenshots below:

Attach policies
Policy allow-sap-lambda-write-to-outbound
Policy allow-sap-lambda-write-to-outbound-policy
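
For reference, the write policy can look like the following sketch; the bucket name matches our example, while the statement ID is illustrative:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowWriteToOutboundBucket",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::ltc-outbound/*"
    }
  ]
}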