SAP To Use AWS's AI Chips and Integrate GenAI Model from Amazon Bedrock

Amazon Web Services (AWS) and SAP SE have announced an expanded strategic collaboration to enhance cloud enterprise resource planning (ERP) experiences and enable enterprises to harness generative artificial intelligence (AI) for new capabilities and efficiencies. This partnership aims to transform how businesses operate by integrating generative AI into their core processes.

SAP plans to use AWS's purpose-built AI chips, AWS Trainium and AWS Inferentia, for training and deploying its future Business AI offerings. These accelerators are designed specifically for artificial intelligence (AI) and machine learning (ML) workloads, and their adoption is a key element of the broader collaboration to bring generative AI into SAP's cloud ERP experiences.

AWS Trainium and AWS Inferentia are custom-built chips designed by Amazon Web Services (AWS) for specific machine learning (ML) tasks: Trainium is optimized for deep learning (DL) training of large models, including generative AI models, while Inferentia is optimized for inference.

Each Trainium accelerator includes two second-generation NeuronCores and 32 GB of high-bandwidth memory (HBM), delivering up to 190 TFLOPS of FP16/BF16 compute, which suits training workloads in natural language processing, computer vision, and recommendation systems. Inferentia2 accelerators likewise carry 32 GB of HBM each, a significant increase in memory capacity and bandwidth over the first generation.

By leveraging AWS's powerful hardware, SAP aims to enhance the performance and efficiency of its AI-driven business applications. This will enable SAP customers to harness the capabilities of generative AI for a variety of applications, streamlining processes and driving innovation within their operations.

The collaboration also includes the integration of generative AI models from Amazon Bedrock, such as the Anthropic Claude 3 model family and Amazon Titan, into the SAP AI Core infrastructure. This will give SAP customers access to high-performing large language models (LLMs) and other foundation models (FMs) that can be customized with their own data.

Amazon Bedrock is a fully managed Amazon Web Services (AWS) service that enables developers to build and scale generative AI applications using foundation models (FMs).
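As a minimal sketch of what working with Bedrock looks like, the snippet below builds a request in Anthropic's Messages format and invokes a Claude 3 model through the Bedrock runtime API. The region, model ID, and prompt here are illustrative assumptions, not details from the announcement, and the call requires valid AWS credentials and model access.

```python
import json


def build_claude_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build a request body in the Anthropic Messages format that
    Bedrock accepts for Claude 3 models."""
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }


def invoke_claude(prompt: str) -> dict:
    """Illustrative only: invoke a Claude 3 model via the Bedrock
    runtime. Region and model ID are assumptions for this sketch."""
    import boto3  # requires AWS credentials configured locally

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",
        body=json.dumps(build_claude_request(prompt)),
    )
    return json.loads(response["body"].read())
```

In SAP's setup, calls like this would be mediated by the generative AI hub in SAP AI Core rather than made directly, but the underlying request shape is the same.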

The goal is to make it easier for customers to adopt the RISE with SAP solution on AWS, improve the performance of SAP workloads in the cloud, and embed generative AI across an enterprise's portfolio of business-critical applications. This initiative is expected to accelerate the adoption of generative AI and modernize key business processes built on SAP solutions.

The generative AI hub in the SAP AI Core infrastructure provides customers with secure access to a broad range of large language models (LLMs) that can easily be integrated into SAP business applications. Tens of thousands of customers use Amazon Bedrock to build and scale generative AI applications quickly and securely, using FMs from leading AI companies such as AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon.