During the re:Invent 2023 conference, AWS announced significant updates to its SageMaker, Bedrock, and database services to enhance its generative AI capabilities.
Swami Sivasubramanian, AWS vice president of data and AI, revealed updates to the foundation models available in Amazon Bedrock, the company’s AI application-building service. New models added to Bedrock include Anthropic’s Claude 2.1, Meta’s Llama 2 70B, and Amazon’s own Titan Text Lite and Titan Text Express. Additionally, AWS introduced a preview model, Amazon Titan Image Generator, which can rapidly generate images from complex prompts.
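To illustrate how applications consume these models, the sketch below calls one of the newly added models through the Bedrock runtime API using the AWS SDK for Python (boto3). The model ID, region, and request fields shown are assumptions for illustration and vary by model and region.

    import json
    import boto3

    # Bedrock Runtime client; the region is an assumption
    client = boto3.client("bedrock-runtime", region_name="us-east-1")

    # Claude models expect Anthropic's prompt format; field names and the
    # model ID "anthropic.claude-v2:1" are assumptions for this sketch
    body = json.dumps({
        "prompt": "\n\nHuman: Summarize Amazon Bedrock in one sentence.\n\nAssistant:",
        "max_tokens_to_sample": 200,
        "temperature": 0.5,
    })

    response = client.invoke_model(
        modelId="anthropic.claude-v2:1",
        contentType="application/json",
        accept="application/json",
        body=body,
    )

    # The response body is a stream containing the model's JSON output
    print(json.loads(response["body"].read())["completion"])

The same invoke_model call works across Bedrock models; only the model ID and the JSON request body change from provider to provider.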
The Titan Image Generator embeds an invisible watermark in its output to help identify AI-generated images and reduce disinformation. AWS is also making Amazon Titan Multimodal Embeddings available, a model that converts images and short text into embeddings that capture their semantic meaning and relationships for storage and search.
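A rough sketch of generating such an embedding follows; the model ID "amazon.titan-embed-image-v1", the request fields, and the sample image file are assumptions for illustration rather than confirmed details.

    import base64
    import json
    import boto3

    client = boto3.client("bedrock-runtime", region_name="us-east-1")

    # Encode a local product photo; the file name is purely illustrative
    with open("product.jpg", "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    # Pairing short text with an image; request shape is an assumption
    body = json.dumps({
        "inputText": "red running shoe",
        "inputImage": image_b64,
    })

    response = client.invoke_model(
        modelId="amazon.titan-embed-image-v1",
        contentType="application/json",
        accept="application/json",
        body=body,
    )

    # The returned vector can be stored in a vector database for similarity search
    embedding = json.loads(response["body"].read())["embedding"]
    print(len(embedding))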
Another addition is Model Evaluation on Amazon Bedrock, which lets enterprises evaluate, compare, and select the foundation model best suited to their use case and business needs. The feature simplifies identifying benchmarks, setting up evaluation tools, and running assessments.
For large language models, AWS introduced SageMaker HyperPod and SageMaker Inference within its Amazon SageMaker service. HyperPod reduces training time by up to 40% and ensures uninterrupted model training, while SageMaker Inference helps enterprises reduce deployment costs and decrease latency in model responses.
Furthermore, AWS updated its low-code machine learning platform, SageMaker Canvas, which now supports LLMs from Anthropic, Cohere, and AI21 Labs. SageMaker also gains a foundation model evaluation capability through SageMaker Clarify, and Amazon Bedrock’s knowledge bases add support for more vector databases, including Amazon Aurora, MongoDB, Pinecone, Redis Enterprise Cloud, and Vector Engine for Amazon OpenSearch Serverless.
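Those vector stores sit behind Bedrock’s knowledge bases, which applications can query for retrieval-augmented generation. A minimal sketch, assuming the boto3 "bedrock-agent-runtime" client and a hypothetical knowledge base ID:

    import boto3

    # Agent runtime client for querying a knowledge base; the region is an assumption
    client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

    # "KB12345678" is a hypothetical knowledge base ID backed by one of the
    # supported vector stores (e.g. OpenSearch Serverless or Pinecone)
    response = client.retrieve(
        knowledgeBaseId="KB12345678",
        retrievalQuery={"text": "What did AWS announce for SageMaker at re:Invent 2023?"},
        retrievalConfiguration={"vectorSearchConfiguration": {"numberOfResults": 3}},
    )

    # Print the retrieved passages that would be passed to a foundation model
    for result in response["retrievalResults"]:
        print(result["content"]["text"][:120])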