Pegasystems expands Pega GenAI to AWS and Google Cloud's LLMs

Pegasystems Inc., a workflow automation platform provider, has expanded its Pega GenAI to connect to large language models (LLMs) from Amazon Web Services (AWS) and Google Cloud.

The expansion will enable clients to connect generative AI services and models with the Pega GenAI architecture, supporting generative AI models within Pega solutions. These include offerings from AWS, such as Amazon Bedrock and Amazon Titan; from Google Cloud, such as Vertex AI and Google Gemini; and Claude from Anthropic.

AWS and Google Cloud generative AI models will be available in Pega Connect GenAI, a plug-and-play architecture that, according to the company, enables low-code developers to author prompts and get value from generative AI in any workflow or decision.

“Clients know the best model for them will depend on a variety of factors, including their own strategy and infrastructure, effectiveness, performance, speed, trust, and cost, so having choice is key. The extension of these relationships underlines Pega’s commitment to becoming the workflow backbone for generative AI solutions to enable truly transformational change for our clients. Our trusted partners play an important role in helping us to deliver these outcomes,” said Don Schuerman, chief technology officer, Pega.

This enables Pega low-code developers to build custom generative AI-powered capabilities into their workflows. For example, Pega GenAI can create a tool that summarizes documents, giving users an overview of the information as soon as they open their tasks.

(With inputs from bl intern Meghna Barik)
