AWS Bedrock - Boto3 Demo - Mistral AI Models

Previous Blogs in this Learning Series

All the blogs in this learning series are available here:

https://blog.dataopslabs.com/series/aws-bedrock

GitHub Link - Notebook

https://github.com/jayyanar/learning-aws-bedrock/blob/main/blog12-Mistral/Bedrock_MistralAI_Boto3.ipynb

Environment Setup

I am using a local VS Code environment with AWS credentials configured.

Check the Python Version

! python --version
Python 3.11.5

Upgrade pip

! pip install --upgrade pip

Install the latest boto3, awscli, and botocore

! pip install --no-build-isolation --force-reinstall \
    "boto3>=1.33.6" \
    "awscli>=1.31.6" \
    "botocore>=1.33.6"

Load the Libraries

Ensure you configure a region where the Mistral models are available. At the time of writing they are available in us-west-2, which the code below uses.

import json
import os
import sys

import boto3
import botocore

bedrock = boto3.client(service_name="bedrock", region_name="us-west-2")
bedrock_runtime = boto3.client(service_name="bedrock-runtime", region_name="us-west-2")
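
As a quick sanity check, the bedrock control-plane client can list the Mistral foundation models visible in the configured region. A minimal sketch (the "Mistral AI" provider name is an assumption based on the console label):

# List the Mistral foundation models available in the configured region
models = bedrock.list_foundation_models(byProvider="Mistral AI")
for model in models["modelSummaries"]:
    print(model["modelId"])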

Mistral 7B Model

  • Mistral 7B is a 7-billion-parameter language model designed for exceptional performance and efficiency in Natural Language Processing (NLP).

  • It outperforms previous top models such as Llama 2 13B and Llama 1 34B across various benchmarks including reasoning, mathematics, and code generation.

  • Leveraging grouped-query attention (GQA) and sliding window attention (SWA), Mistral 7B achieves superior performance without sacrificing efficiency.

  • Mistral 7B models are released under the Apache 2.0 license, facilitating easy deployment and fine-tuning for diverse tasks.

  • Ref: https://blog.dataopslabs.com/my-understanding-of-mistral-ai

Set the Prompt

mistral_code_prompt = "[INST]You are a travel agent with knowledge of monument places around the world[/INST]\nGive 10 places in India with numbering. Give me only the places, don't give more details"

Configure the Model Parameters

body = json.dumps({
    "prompt": mistral_code_prompt,
    "max_tokens": 250,
    "temperature": 0.5  # Temperature controls randomness; higher values increase diversity, lower values boost predictability.
})
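
Besides max_tokens and temperature, the Mistral models on Bedrock also accept top_p, top_k, and stop sequences. A sketch of a fuller request body (the body_full name and the values are illustrative, not tuned):

body_full = json.dumps({
    "prompt": mistral_code_prompt,
    "max_tokens": 250,
    "temperature": 0.5,
    "top_p": 0.9,        # Nucleus sampling: keep tokens within the top 90% of probability mass
    "top_k": 50,         # Restrict sampling to the 50 most likely tokens
    "stop": ["[INST]"]   # Stop generation if this sequence appears
})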

Invoke the Model

response = bedrock_runtime.invoke_model(
    body=body,
    modelId="mistral.mistral-7b-instruct-v0:2",
    # Other text models (e.g., meta.llama2-13b-chat-v1) can be swapped in,
    # but note that each model family expects its own request-body format.
    accept="application/json", 
    contentType="application/json"
)
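
invoke_model raises a botocore ClientError if the call fails, for example when model access has not been granted in the Bedrock console or the region is wrong. Since botocore is already imported, a minimal guard might look like this (a sketch, not from the original notebook):

try:
    response = bedrock_runtime.invoke_model(
        body=body,
        modelId="mistral.mistral-7b-instruct-v0:2",
        accept="application/json",
        contentType="application/json"
    )
except botocore.exceptions.ClientError as error:
    # Common causes: model access not enabled, wrong region, malformed body
    print("Bedrock invocation failed:", error.response["Error"]["Message"])
    raise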

Parse the response for Text Completion

response_output = json.loads(response.get('body').read())
mistral_parse_text = response_output['outputs'][0]['text']
mistral_parse_text = mistral_parse_text.replace('\n', ' ')
mistral7b_output = mistral_parse_text.strip()

# Print the single-line output (newlines were replaced with spaces above)
print(mistral7b_output)
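
For longer generations you can stream tokens as they are produced with invoke_model_with_response_stream instead of waiting for the full response. A minimal sketch, assuming each streamed chunk mirrors the non-streaming outputs structure:

stream_response = bedrock_runtime.invoke_model_with_response_stream(
    body=body,
    modelId="mistral.mistral-7b-instruct-v0:2",
    accept="application/json",
    contentType="application/json"
)

# Print each piece of generated text as it arrives
for event in stream_response["body"]:
    chunk = json.loads(event["chunk"]["bytes"])
    print(chunk["outputs"][0]["text"], end="")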

Text Completion

like history, how to reach, best time to visit, etc. just the names of the monuments or places: 1. Taj Mahal, Agra, Uttar Pradesh 2. Red Fort, Delhi 3. Qutub Minar, Delhi 4. Hampi, Karnataka 5. Mahabodhi Temple, Bodhgaya, Bihar 6. Ajanta and Ellora Caves, Aurangabad, Maharashtra 7. Konark Sun Temple, Konark, Odisha 8. Khajuraho Temples, Madhya Pradesh 9. Meenakshi Amman Temple, Madurai, Tamil Nadu 10. Brihadeeswara Temple, Thanjavur, Tamil Nadu. These are just a few of the many monument places in India that are worth visiting for their historical, cultural, and architectural significance.

Mixtral 8x7B

  • Mixtral 8x7B is a Sparse Mixture of Experts (SMoE) language model. Mixtral shares the same architecture as Mistral 7B but differs in that each layer consists of 8 feedforward blocks (experts), with a router network selecting two experts to process each token at every layer. Despite having access to 47B parameters, Mixtral only utilizes 13B active parameters during inference. Mixtral outperforms or matches Llama 2 70B and GPT-3.5 across various benchmarks, especially excelling in mathematics, code generation, and multilingual tasks.

  • Ref: https://blog.dataopslabs.com/my-understanding-of-mistral-ai

Set the Prompt

mixtral_code_prompt = "[INST]You are a finder of the best beach places for scuba diving[/INST]\nGive 10 places in the world with numbering. Give me only the places, don't give more details"

Configure the Model Parameters

body = json.dumps({
    "prompt": mixtral_code_prompt,
    "max_tokens": 250,
    "temperature": 0.5  # Temperature controls randomness; higher values increase diversity, lower values boost predictability.
})

Invoke the Model

response = bedrock_runtime.invoke_model(
    body=body,
    modelId="mistral.mixtral-8x7b-instruct-v0:1",
    # Other text models (e.g., meta.llama2-13b-chat-v1) can be swapped in,
    # but note that each model family expects its own request-body format.
    accept="application/json", 
    contentType="application/json"
)

Parse the response for Text Completion

response_output = json.loads(response.get('body').read())
mixtral_parse_text = response_output['outputs'][0]['text']
mixtral_parse_text = mixtral_parse_text.replace('\n', ' ')
mixtral_output = mixtral_parse_text.strip()

# Print the single-line output (newlines were replaced with spaces above)
print(mixtral_output)

Text Completion

1. Great Barrier Reef, Australia 2. Palau, Micronesia 3. Blue Hole, Belize 4. Raja Ampat, Indonesia 5. Maldives, Indian Ocean 6. Cozumel, Mexico 7. Galapagos Islands, Ecuador 8. Sipadan Island, Malaysia 9. Fiji, South Pacific 10. Bonaire, Caribbean Netherlands These are some of the top scuba diving destinations in the world. Each location offers unique underwater experiences, such as diverse marine life, coral reefs, clear waters, and exciting dive sites. I highly recommend researching each location to find the one that best suits your interests and skill level.
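
Both walkthroughs above repeat the same invoke-and-parse flow. If you plan to experiment further, you could fold it into a small helper and pass the model ID as an argument. A minimal sketch (the invoke_mistral name is my own, not from the notebook):

def invoke_mistral(prompt, model_id, max_tokens=250, temperature=0.5):
    # Build the request body expected by Mistral models on Bedrock
    request = json.dumps({
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature
    })
    response = bedrock_runtime.invoke_model(
        body=request,
        modelId=model_id,
        accept="application/json",
        contentType="application/json"
    )
    output = json.loads(response["body"].read())
    # Flatten the completion into a single line, as done above
    return output["outputs"][0]["text"].replace("\n", " ").strip()

print(invoke_mistral(mistral_code_prompt, "mistral.mistral-7b-instruct-v0:2"))
print(invoke_mistral(mixtral_code_prompt, "mistral.mixtral-8x7b-instruct-v0:1"))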