Harness Amazon Bedrock Agents to Manage SAP Instances


Introduction

In this blog, we demonstrate how Amazon Bedrock Agents can be used to assist in the administration of SAP instances. By leveraging the agent’s ability to call the web services provided by SAPControl, users can efficiently manage their SAP environments.

SAPControl is a SOAP web service interface that is primarily used for stopping, starting, and monitoring SAP processes. However, beyond these common use cases, the service offers a wide range of functionalities that can be very valuable in administrative scenarios.
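To make this concrete, the SAPControl web service of an instance listens on HTTP port 5&lt;nn&gt;13 of the SAP host and publishes a WSDL that any SOAP client can consume. The sketch below, using the zeep library, shows one way this could look; the host, instance number, and credentials are placeholders, not values from this blog's sample code.

```python
# Sketch: querying SAPControl's SOAP interface with zeep.
# Host, instance number, and credentials are placeholders.
def sapcontrol_wsdl_url(host: str, instance_no: str) -> str:
    """SAPControl listens on HTTP port 5<nn>13, where <nn> is the instance number."""
    return f"http://{host}:5{instance_no}13/?wsdl"


def get_process_list(host, instance_no, user, password):
    # Non-stdlib imports are kept inside the function so the URL helper
    # stays usable even without zeep/requests installed.
    from requests import Session
    from requests.auth import HTTPBasicAuth
    from zeep import Client
    from zeep.transports import Transport

    session = Session()
    session.auth = HTTPBasicAuth(user, password)  # e.g. the <sid>adm user
    client = Client(sapcontrol_wsdl_url(host, instance_no),
                    transport=Transport(session=session))
    return client.service.GetProcessList()


if __name__ == "__main__":
    # Example invocation against a hypothetical instance 00 on 192.168.0.12.
    for proc in get_process_list("192.168.0.12", "00", "mldadm", "********"):
        print(proc.name, proc.textstatus)
```

GetProcessList is one of the read-only SAPControl methods; Stop, Start, and ParameterValue follow the same call pattern through `client.service`.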

Prerequisites for this blog:

  • The user is an experienced SAP administrator or technical architect who is familiar with SAPControl and managing an SAP system.
  • Access to a healthy and running AWS-hosted SAP instance that has the sapstartsrv process running.
  • AWS CLI is installed and configured on the SAP server.
  • The EC2 security group for the SAP server allows access on port 5<nn>13, where <nn> is the instance number of the SAP instance.
  • The EC2 instance role has access to an S3 bucket.
  • Access to the Amazon Bedrock service and a Large Language Model (e.g., Claude Haiku).
  • Knowledge of and access to the AWS Lambda service.
  • A working Python development environment
  • Python development skills, including the use of virtual environments, layers, etc.

Process Flow

An AWS Lambda function (hereinafter referred to as “Lambda”) is created to serve as a web proxy that calls the SAPControl SOAP web service interface on the SAP server. The Lambda function uses the <sid>adm user and its password to authenticate. For enhanced security, the password is retrieved from an encrypted parameter store in AWS Systems Manager, rather than being hardcoded in the function.
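A sketch of that parameter lookup is shown below, assuming the naming convention used later in this blog (&lt;sid&gt;adm for the password, &lt;sid&gt;no for the instance number); the retrieval function itself is illustrative, not the blog's exact code.

```python
# Sketch: retrieving the <sid>adm password and instance number from
# AWS Systems Manager Parameter Store. Parameter names follow the
# <sid>adm / <sid>no convention used in this blog.
def password_parameter_name(sid: str) -> str:
    return sid.lower() + "adm"      # e.g. "mldadm" for SID MLD


def instance_no_parameter_name(sid: str) -> str:
    return sid.lower() + "no"       # e.g. "mldno" for SID MLD


def fetch_sap_credentials(sid: str):
    import boto3                    # available by default in Lambda
    ssm = boto3.client("ssm")
    # WithDecryption=True is required to read a SecureString value.
    password = ssm.get_parameter(
        Name=password_parameter_name(sid),
        WithDecryption=True)["Parameter"]["Value"]
    instance_no = ssm.get_parameter(
        Name=instance_no_parameter_name(sid))["Parameter"]["Value"]
    return password, instance_no
```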

The sample function provided in this blog will enable the following tasks:

  • Check the value of an SAP parameter
  • Send log files to a designated S3 bucket
  • Stop an SAP instance
  • Start an SAP instance
  • Check the process status of an SAP instance

The Lambda function is then integrated with an Amazon Bedrock Agent powered by a Large Language Model (hereinafter referred to as “LLM”) of choice. This allows SAP administrators to interact with and operate their SAP systems using natural language.

Figure 01 – Architecture overview of the solution

High Level Steps

  1. Create a Lambda layer in Python that contains the libraries the Lambda function uses
  2. Create a Lambda function that acts as a Web proxy
  3. Create a Bedrock Agent and assign the Lambda function we created
  4. Create new parameters in Systems Manager to store the <sid>adm password and instance number

Please note that the solution presented in this blog allows you to alter the status of an SAP instance. It is strongly recommended to practice on a non-production, sandbox environment before attempting this on a business-critical SAP system. We cannot take responsibility if a mission-critical SAP system is impacted by following the steps outlined in this blog.

Detailed Steps

Creating the Lambda Layer
In your existing Python development environment, install the required packages based on the requirements.txt file. In this blog, we used Python 3.11 and the compatible packages. Please note that, depending on your exact setup and versions, you may need to change the requirements.txt file. Once the packages are installed, zip the entire python directory, including the subdirectories and contents of /python/lib/python3.11/site-packages. This will be the zipped layer file for the Lambda function. We highly recommend using pyenv and virtual environments to install the required packages in order to create the layer.
If you require more information on how to install packages using the requirements.txt file, or how to work with Python layers, please check the Appendix at the end of this blog. You may also ask Python developers in your organization for help. This blog focuses on the Amazon Bedrock and SAP-related steps, so a detailed guide to Python is out of scope.

Once the layer zip file is created, log in to the AWS account, navigate to Lambda, and upload the layer (see Figure 02).

Figure 02 – Load Lambda Layer

The zipped layer file contains the necessary Python libraries for zeep and boto3. In our example, the layer file is called “sapcontrollayer311.zip” and we used the x86_64 processor architecture; your setup may differ, so you may deviate slightly from the screenshots below to accommodate it (see Figure 03).

Figure 03 – Lambda Layer Configuration

Make sure to set the architecture and runtime to match your needs (in our example, x86_64 and Python 3.11 compatibility). The layer is now uploaded (see Figure 04).

Figure 04 – Lambda Layer Created

Create a new Lambda function, called “SAPControlBedrockAgentLambda” – see Figure 05

Figure 05 – Create New Lambda Function

Give the function a name, for example “SAPControlBedrockAgentLambda”. Set “Runtime” to Python 3.11 and “Architecture” to x86_64 (or whichever setup you are using) – see Figure 06.

Figure 06 – Lambda Configuration

Select “Create a new role with basic Lambda permissions” for now; we will revisit the role later. See Figure 07.

Figure 07 – Lambda Configuration – continued

Also, enable “VPC” under “Advanced settings”, then select the VPC and subnet(s) where your SAP instances are hosted. Figure 08 shows an example setup.

Figure 08 – Lambda function configuration – continued

Then, copy the sample code into the Lambda function and click “Deploy”; otherwise, the code will not be committed.
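As a rough illustration of the handler's shape (not the blog's exact code): a Bedrock Agents function handler receives the action group, the function name, and a parameter list in the incoming event, and must echo the action group and function back in a functionResponse. The SOAP helper referenced in the dispatch branch is hypothetical.

```python
# Illustrative shape of a Bedrock Agents function handler (not the blog's
# exact sample code). The event carries actionGroup, function, and
# parameters; the response must echo actionGroup and function back.
def get_param(event: dict, name: str, default=None):
    """Extract a named parameter from the Bedrock Agent event."""
    for p in event.get("parameters", []):
        if p.get("name") == name:
            return p.get("value")
    return default


def build_response(event: dict, body_text: str) -> dict:
    return {
        "messageVersion": "1.0",
        "response": {
            "actionGroup": event["actionGroup"],
            "function": event["function"],
            "functionResponse": {
                "responseBody": {"TEXT": {"body": body_text}}
            },
        },
    }


def lambda_handler(event, context):
    function = event.get("function")
    host = get_param(event, "host")
    sid = get_param(event, "sid")
    if function == "get-process-status":
        # Hypothetical helper wrapping the SAPControl SOAP call.
        result = sapcontrol_get_process_list(host, sid)  # noqa: F821
        return build_response(event, str(result))
    return build_response(event, f"Unknown function: {function}")
```

Each of the five action group functions created later would get its own branch in the dispatch, all returning through `build_response`.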

Once the code is deployed, attach the previously uploaded layer to the function (see Figure 09).

Figure 09 – Adding a Layer to Lambda

Then, select “Custom layers” and pick the layer from the dropdown. (see Figure 10)

Figure 10 – Select Custom Layer

Since LLM parsing and responses can take longer, increase the Lambda function timeout from the default 3 seconds to 1 minute so that the function does not time out mid-processing. Figure 11 shows how to make that change.

Figure 11 – Change Lambda Time Out to 1 minute

Check the Lambda Execution role, open it for editing, and add a policy that allows access to AWS Systems Manager Parameter Store. Figure 12 shows how to locate the role.

Figure 12 – Lambda Execution Role

Figure 13 below shows a sample inline policy that allows access to SSM parameters.

Figure 13 – Sample Inline Policy
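For reference, a minimal inline policy of this kind could look as follows; this is an assumed example, and in practice you should scope the Resource to your own account, region, and parameter names rather than using a wildcard.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["ssm:GetParameter", "ssm:GetParameters"],
      "Resource": "arn:aws:ssm:*:*:parameter/*"
    }
  ]
}
```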

Make sure that an LLM is enabled for use.
In the Amazon Bedrock service, under “Model Access”, enable the “Claude 3 Haiku” model (refer to Figure 14).

Figure 14 – Enable an LLM

Create an Amazon Bedrock Agent and associate it with the Lambda function we’ve just deployed.

Figure 15 – Create Bedrock Agent

Add the following details to the agent (Figure 16)

Figure 16 – Bedrock Agent Details

Use the “Claude 3 Haiku” model (or whichever model you enabled in the previous step). Please note that using any LLM is subject to fees, calculated on the basis of so-called tokens. For more information about pricing, please visit the official AWS pricing page.

The example scenarios at the end of this blog would cost a total of around $0.12 using the Claude 3 Haiku LLM. There could be some variance depending on the exact region you run this in and the number of words (tokens) you use as input and output.

Put the following text as Instructions for the Agent.

“You are a helpful agent who can help SAP system administrators with various tasks, including checking profile parameters, stopping and starting SAP systems, loading SAP system logfiles to an S3 bucket, and checking the process list and statuses of an SAP system.”

The wording is very important, because this text guides Amazon Bedrock in deciding whether the agent should be used for a given task.

Add new Action groups
To add a new Action group, click on “Add”, and call the Action Group “ag-sapcontrolagent”.

Figure 17 – Add a new Action group

Under the “Action group invocation” section, select the “Select an existing Lambda function” radio button and choose the name of the Lambda function that was created earlier.

Figure 18 – Associate the Lambda function to the Action group

Create the Action group functions
Create the first function, called “get-parameter-value”.
Use the exact same names for the functions and parameters, because they must match what is in the Lambda code.

Figure 19 – Add function get-parameter-value

Instead of filling in the details manually, you may simply copy-paste the get-parameter-value JSON once you switch from the standard Table view to the “JSON Editor” view (top right of the screen).
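The exact JSON shape may differ slightly by console version, but a function definition in the JSON editor looks roughly like the example below; the descriptions and parameter names here are illustrative and must match whatever names your Lambda code expects.

```json
{
  "name": "get-parameter-value",
  "description": "Check the value of an SAP profile parameter",
  "parameters": {
    "host": {
      "description": "Host name or IP address of the SAP instance",
      "type": "string",
      "required": true
    },
    "sid": {
      "description": "SAP system ID (SID)",
      "type": "string",
      "required": true
    },
    "parameter_name": {
      "description": "Name of the SAP profile parameter to check",
      "type": "string",
      "required": true
    }
  }
}
```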

Then add more functions, by clicking on the button “Add action group function”:

Name the next Action group function “load-logfiles-to-s3”:

Figure 20 – Add function load-logfiles-to-s3

Again, you may simply copy-paste the load-logfiles-to-s3 JSON once you switch to the JSON editor view.
Let’s create another function to stop an instance, called “stopinstance”.

Figure 21 – Add function stopinstance

Or just copy-paste the stopinstance JSON.

Next create the start function, called “startinstance”.

Figure 22 – Add function startinstance

You may copy-paste the startinstance JSON from here, if easier.

Finally, add the function “get-process-status”, so we can verify the instance status through it.

Figure 23 – Add function get-process-status

The JSON format for the get-process-status function is also available.

Then click on “Create”. If the “Instructions for the Agent” field was erased, fill it in again (this does happen sometimes). Then click “Save and Exit”.

Click on the “Prepare” button to ready up the test version of the Agent.

Figure 24 – Prepare the Agent

On this same screen make a note of the Agent ARN. You will need it in a minute.

Revise Lambda Permissions
Navigate back to the Lambda function. Under the “Configuration” tab, in the “Permissions” pane, add the following “Resource-based policy statements” (click the “Add permissions” button).

Figure 25 – Modifying the Lambda permissions

Add a statement so that the Amazon Bedrock Agent can invoke the Lambda function.

Figure 26 – Add Amazon Bedrock Agent to be able to invoke Lambda function

Select “AWS service”
Service: “Other”
Statement ID: free text, e.g. “sapcontrolbedrockagent”
Principal: “bedrock.amazonaws.com”
Source ARN: you can check back on your Amazon Bedrock Agent for the ARN (see a few screens above)
Action: “lambda:InvokeFunction”

This policy allows the Amazon Bedrock Agent to invoke the Lambda function.
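If you prefer scripting over the console, the same statement can be added with boto3's add_permission call; the function name and agent ARN below are placeholders for your own values, and the statement ID is the same free-text example used above.

```python
# Sketch: granting the Bedrock Agent permission to invoke the Lambda
# function via a resource-based policy statement. The function name and
# agent ARN are placeholders for your own values.
def permission_statement(function_name: str, agent_arn: str) -> dict:
    return {
        "FunctionName": function_name,
        "StatementId": "sapcontrolbedrockagent",
        "Action": "lambda:InvokeFunction",
        "Principal": "bedrock.amazonaws.com",
        "SourceArn": agent_arn,
    }


def grant_bedrock_invoke(function_name: str, agent_arn: str):
    import boto3
    lambda_client = boto3.client("lambda")
    return lambda_client.add_permission(
        **permission_statement(function_name, agent_arn))
```

Usage would be a single call such as `grant_bedrock_invoke("SAPControlBedrockAgentLambda", "arn:aws:bedrock:us-east-1:111122223333:agent/EXAMPLE")`, with the real agent ARN noted down earlier.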

Maintain System Manager Parameter Store
One last thing to maintain is the information the code needs to look up <sid>adm passwords and instance numbers; it is not good practice to hardcode passwords in the code. For this, create the following two entries within AWS Systems Manager -> Parameter Store.

Figure 27 – Parameter Store in AWS Systems Manager

Add a SecureString parameter to store the password for <sid>adm. In the example below, the SAP SID is MLD, hence the “mldadm” entry.

Figure 28 – Add SecureString Parameter

Add an instance number parameter too, using the <sid>no pattern; in the case of the MLD system, the entry is “mldno”. This can be a plain String type.

Figure 29 – SAP system number parameter entry

Please make sure that only the necessary people and services have access to the Parameter Store.
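Scripted with boto3, creating the two entries for a system with SID MLD could look like the sketch below; the password value is a placeholder.

```python
# Sketch: creating the two Parameter Store entries for an SAP system.
# The password value is a placeholder; never commit real passwords.
def parameter_specs(sid: str, password: str, instance_no: str):
    sid = sid.lower()
    return [
        {"Name": sid + "adm", "Value": password,
         "Type": "SecureString", "Overwrite": True},   # <sid>adm password
        {"Name": sid + "no", "Value": instance_no,
         "Type": "String", "Overwrite": True},         # <sid>no instance number
    ]


def create_parameters(sid, password, instance_no):
    import boto3
    ssm = boto3.client("ssm")
    for spec in parameter_specs(sid, password, instance_no):
        ssm.put_parameter(**spec)
```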

Test the Agent

It is finally time to test a conversation with the Agent. Navigate back to the Bedrock Agents service, choose your newly created agent, and use the prompt area to run the following tests (Figure 30 below shows the prompt area highlighted in yellow).

Figure 30 – Interact with the Agent

Ask the agent for a parameter value

Ask the agent:
“What is the parameter value of SAPDBHOST for MLD SAP system on host 192.168.0.12?”

Answer from the Bedrock Agent:
“The parameter value of SAPDBHOST for the MLD SAP system on host 192.168.0.12 is ‘sapvhana’.”

Figure 31 – Example of conversation with the Agent

Ask for more parameter values in the same prompt

Question:
“What is the parameter value of rdisp/wp_no_dia and SAPLOCALHOST for MLD SAP system on host 192.168.0.12?”

Answer:
“The parameter values for the MLD SAP system on host 192.168.0.12 are: – rdisp/wp_no_dia = 10 – SAPLOCALHOST = sapmldpas”

Notice that the agent understood that you were asking for the values of two separate parameters, and it invoked the function twice.

Check the status of the SAP instance

Ask the following:
“What is the status of the SAP system MLD on host 192.168.0.12?”

Answer should be something like:
“The status of the SAP system MLD on host 192.168.0.12 is that all the key processes like dispatcher, IGS watchdog, gateway, and ICM are running and in a healthy state.”

Figure 32 – Another example of conversation with the Agent

Then stop the instance
! Please bear in mind that the next example will actually shut down the SAP instance. Follow this step only if your system can be safely shut down without business impact !

Input:
“Stop SAP system MLD on host 192.168.0.12”

Output:
“The SAP system MLD on host 192.168.0.12 has been stopped successfully.”

Check the status again
See prompts below in Figure 33 to check status again.

Figure 33 – Stopped system status detected by the Agent

Then, start the system again
Please note that you do not need to provide the host information to the agent anymore. The agent remembers the host details (192.168.0.12) from the previous conversations. (see Figure 34)

Input prompt:

“start SAP system MLD”

Figure 34 – Start SAP system by the Agent

Check status
After waiting a few minutes to allow the SAP instance to start, ask the agent about the status. See Figure 35 below.

Figure 35 – Status check of SAP instance

Using Amazon Bedrock Knowledge Base

In this next example, we use an Amazon Bedrock Knowledge Base to store logfiles, and we search for errors in the logs. Let’s create a temporary empty S3 bucket. In this example, we use the bucket name “bedrockdemosaplogs”, but you can choose any bucket name as long as it is not already in use (or simply use an existing bucket if one exists for this purpose). We will have the agent upload logfiles into this bucket, which will then be used as an Amazon Bedrock Knowledge Base (hereinafter referred to as “KB”). Ensure that the EC2 instance role has the necessary permissions to access the S3 bucket.
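Behind the scenes, the load-logfiles-to-s3 function could do something like the following sketch; the work-directory path, key layout, and bucket name are placeholders, and the blog's actual sample code may differ. Skipping .old files mirrors the KB instruction used later.

```python
# Sketch: uploading SAP work-directory logfiles to an S3 bucket.
# Paths, key layout, and bucket name are placeholders.
import os


def s3_key_for(sid: str, filename: str) -> str:
    """Prefix uploaded logs by SID so several systems can share a bucket."""
    return f"{sid.upper()}/logs/{filename}"


def upload_logfiles(sid: str, work_dir: str, bucket: str):
    import boto3
    s3 = boto3.client("s3")
    uploaded = []
    for filename in os.listdir(work_dir):
        if filename.endswith(".old"):   # skip rotated copies
            continue
        s3.upload_file(os.path.join(work_dir, filename),
                       bucket, s3_key_for(sid, filename))
        uploaded.append(filename)
    return uploaded
```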

To better demonstrate the power of the KB and simulate a real-life situation, we will deliberately cause an error to occur so that we have some errors in the log files.

In this example, we will shut down the message server by stopping the cluster behind it, without shutting down the application servers.

Figure 36 – Causing a test case of error

Then tell the agent to load the logfiles to the bucket (make sure to use the s3://<bucketname>/ format). See Figure 37 below.

Figure 37 – Tell the Agent to load the logfiles to the bucket

The logfiles are now, indeed, showing up in the bucket. See Figure 38 below.

Figure 38 – Logfiles are shown in the bucket

Then go to Amazon Bedrock -> Knowledge bases -> Create knowledge base.

Figure 39 – Create the Knowledge Base

Give it a name and select S3 as the data source.

Figure 40 – Add S3 as the data source

Provide the S3 bucket URI information.

Figure 41 – Provide the S3 URI

Choose the “Titan Text Embeddings v2” model for Vector store.

Figure 42 – Choose the vector store

Click the “Create knowledge base” button on the final summary screen to create the KB.

Once the KB is created, run the Sync job.

Figure 43 – Knowledge Base Sync

Monitor the sync process.

Figure 44 – Sync process may take long time

The sync process may take longer depending on the amount of logs.

Then go back to the Agent and edit the configuration. Add the KB that we have just created and synced.

Figure 45 – Add KB to the Agent

Select the Knowledge Base to add to the Agent.

Instructions for the agent should be as follows:

“Search for errors in logfiles. Ignore .old files. Try to locate the most recent error message of the same kind.” (See Figure 46 below)

Figure 46 – Select the KB

Save the agent and Prepare it again. (Do not forget to prepare it; otherwise, it will not pick up the newly added knowledge base.)

Test the Agent with Knowledge Base

Ask the agent if it detects any errors in the logs. An example prompt is shown below in Figure 47.

Figure 47 – Agent analyzes the logs in Knowledge Base

The Agent should report the error message found in the log files.

Try it yourself

  • Try to analyze the Bedrock agent traces and understand how it breaks the conversation down into actions.
  • Take a function of SAPControl that is not yet in the sample code above and try to implement it (adding the code to the Lambda function, creating an action group function, etc.). It does not require a lot of coding skill: copy and paste the relevant section and modify it for the new function. Don’t forget to update the agent description with the additional task.
  • Instead of keeping information in Systems Manager Parameter Store, try loading your entire landscape information as a CSV file to an S3 bucket and adding it to your agent as a knowledge base.
  • Combine parameter checks with cross-checking of SAP EarlyWatch Alert reports (added as Knowledge Bases) to determine whether a certain parameter value deviates from the recommendations.

Costs

  • The existing SAP system incurs costs (EC2, storage, etc.); these vary depending on your exact SAP system.
  • LLM pricing is based on tokens (including both input prompts and output texts), so this also varies. For the latest pricing, check the official AWS page.

Summary

In this blog post, we demonstrated a use case on how to leverage Amazon Bedrock Agents to assist in executing basic SAP operational tasks, such as starting and stopping an SAP instance, or checking health status and parameter values. We also utilized the Knowledge Base to help locate relevant log entries within the vast amount of log files.

These examples are just a sample of the potential use cases that can be implemented with a similar approach. There is a plethora of additional use cases that can be explored and implemented in the same way.

Learn more

To read more about Amazon Bedrock service visit this page.

To learn more about how to build Lambda function with Python, check this site.

In this AWS documentation, you can find more guidance on how to work with layers for Python. There are some more useful tips here and here as well.
