Unlocking customer relationship opportunities using GenAI and Amazon Bedrock - Impetus

May 2024

Innovation is the driving force behind transformative changes in the dynamic business landscape, and Generative AI (GenAI) stands out as a prime example of reshaping how organizations operate. Amazon Web Services (AWS) is leading this revolution through its groundbreaking service, Amazon Bedrock. This fully managed service provides access to cutting-edge foundation models through a unified API. Amazon SageMaker, another AWS-managed service, also simplifies the development and expansion of applications based on Generative AI.

This post explores how a GenAI-based solution built on Amazon Bedrock by Impetus Technologies helps enterprises transform customer relationships in the thriving co-branded credit card market, which is expected to reach a market size of USD 1.4 trillion by 2025.

Use case: Selecting the best-suited credit card

Businesses offer multiple co-branded credit cards with varying annual fees and benefits. Customers often struggle to identify the most suitable card for their needs. Impetus Technologies developed a Q&A-based application to address this challenge, allowing customers to ask questions and make informed decisions.

To enhance the application’s responses, a document containing information about the offered products was provided to the model. Prompt Engineering was employed, refining responses through different prompts and iterative model interactions. This approach proved to be a quicker, more cost-effective method for obtaining use-case-specific responses.
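To illustrate the kind of prompt iteration involved, a template might constrain the model to the product document and ask for concise answers. The wording below is a hypothetical example, not the production prompt:

```python
# Hypothetical prompt template; the production prompts were refined iteratively.
PROMPT_TEMPLATE = """You are an assistant for co-branded credit card questions.
Answer using only the product document below. Keep the answer concise.

Product document:
{document}

Question: {question}"""


def build_prompt(document: str, question: str) -> str:
    """Combine the product document and the user's question into one prompt."""
    return PROMPT_TEMPLATE.format(document=document, question=question)
```

Varying the instructions in such a template (tone, length limits, allowed sources) and re-running the model is the iterative loop described above.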

Techniques like Retrieval Augmented Generation (RAG), Model Fine-tuning, and Model Pretraining can be leveraged for further accuracy. However, these advanced methods come with additional cost and complexity, as illustrated in the graph below:

Solution overview

Combining technology and business strategy, the solution facilitates customer acquisition and accelerates customer growth. The UI-based application features a Q&A screen, allowing customers to ask specific questions about various credit cards, such as benefits like Priority Boarding or Free Checked Bags limits.

Under the hood, Anthropic’s Claude LLM processes a document covering diverse co-branded credit card options, benefits, features, and terms.

Here is how it works:

1. The user sends a request to the GenAI application hosted on the Amazon EKS cluster

2. An Amazon OpenSearch-powered vector store, which holds the embedded data and handles vector search, scans the document containing the diverse co-branded credit card options

3. The index returns search results containing relevant excerpts from the uploaded document

4. The application sends the question, along with the data retrieved from the index as context, to the LLM

5. The LLM generates a brief response to the user’s request based on the retrieved data

6. The LLM sends the response back to the user

Based on the response, customers can make informed choices tailored to their needs. The solution functions as both a recommendation engine and a Q&A application, allowing users to seek card recommendations based on their preferences. It also offers the flexibility to request concise responses through simple prompts.

A high-level architecture diagram of the solution is given below:

Technical requirements

1. Python version: 3.10

2. LangChain

3. boto3

4. botocore

5. FastAPI

AWS Services used

1. Amazon Bedrock

2. Amazon OpenSearch

3. Amazon Elastic Kubernetes Service (Amazon EKS)

4. Amazon Elastic Container Registry (Amazon ECR)

Key features

Some of the key features of the solution are as follows:

  • Implements a Q&A-based approach by utilizing an LLM with Amazon Bedrock support
  • Utilizes Amazon OpenSearch for searching, visualizing, and analyzing extensive text and unstructured data
  • Facilitates semantic search to locate similar text fragments in the vector space
  • Users can select foundation models based on requirements. The application is implemented with Anthropic’s Claude foundation model (FM) supported by Amazon Bedrock
  • Streamlines chat history management with the LangChain framework

Solution deep dive

Step 1: Create an Amazon Bedrock client

In this step, you need to specify the foundation model you want to use. As mentioned earlier, we have leveraged Anthropic’s Claude FM. The snapshot below details how to create the Amazon Bedrock client:

Step 2: Create the Bedrock Embeddings using Amazon Bedrock client

Using the Amazon Bedrock client, create the BedrockEmbeddings. Embeddings play a pivotal role in large language models (LLMs) and generative artificial intelligence (GenAI), allowing them to represent words and text in a manner that captures their semantic associations and contextual significance. The following snapshot illustrates the process of creating embeddings:

Step 3: Add document upload capability

Add the capability to upload documents to the Amazon OpenSearch vector store using Bedrock Embeddings. For a comprehensive understanding of Amazon OpenSearch’s capabilities and features, please refer to this link. The screenshot below demonstrates the document upload process:

Step 4: Leverage Amazon OpenSearch to read indexed data
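A sketch of the read path, assuming the vector store handle created in the previous step; `similarity_search` performs the semantic (k-NN) search over the index:

```python
def retrieve_context(vector_store, question: str, k: int = 3) -> str:
    """Run a semantic search over the index and join the top-k excerpts.

    `vector_store` is assumed to be a LangChain vector store (such as
    OpenSearchVectorSearch) whose documents expose `page_content`.
    """
    docs = vector_store.similarity_search(question, k=k)
    return "\n\n".join(doc.page_content for doc in docs)
```

The joined excerpts become the context string passed to the LLM alongside the user's question.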

Step 5: Application queries FastAPI for responses

The application invokes the “/getResponse” endpoint when a user poses a question, as demonstrated in the screenshot below:

Step 6: Retrieve answers from Bedrock Services

The APIs fetch answers using Bedrock services and return the responses to the user.

Conclusion

The use case demonstrates the transformative impact of GenAI and Amazon Bedrock on customer engagement, showcasing technology’s potential to reshape business connections. Leveraging Amazon Bedrock and Amazon OpenSearch for precise, contextualized answers, it highlights how Impetus Technologies delivered the solution with Anthropic’s Claude v2 model. As an AWS-managed service, Amazon Bedrock ensures a seamless conversational AI experience, reflecting Impetus Technologies’ commitment to pioneering solutions with GenAI capabilities.

To learn more about Impetus’ GenAI offerings and solutions, contact us.  

Impetus Technologies – AWS Partner Spotlight

Impetus Technologies is an AWS Partner that solves data, AI, and cloud puzzles with unmatched expertise in cloud and data engineering.

About Authors

Niten Kapoor is a Senior Technical Architect at Impetus Technologies. He has strong expertise in Big Data and ML/AI and is enthusiastic about Generative AI and LLMs. He has a strong background in application development and architecture.

Balasaheb P. Patil is a Senior Software Engineer at Impetus Technologies. He has strong experience in application development using AWS Cloud services. He enjoys solving business problems with MLOps and software engineering. In his free time, he enjoys learning about innovative technologies, watching Sci-fi movies, and playing chess.

