Amazon Web Services (AWS) continues to expand its catalog of foundation models (FMs), giving developers and businesses access to some of the most capable models in the industry. In this post, we look at the latest addition: two new OpenAI models with open weights, now available on Amazon Bedrock and Amazon SageMaker JumpStart.
The Latest OpenAI Models on AWS
AWS has introduced two new models, gpt-oss-120b and gpt-oss-20b, which are now accessible through Amazon Bedrock and Amazon SageMaker JumpStart. These models are designed for text generation and reasoning tasks, and they give developers and organizations the flexibility to build AI applications while maintaining full control over their infrastructure and data.
Both models are particularly adept at tasks involving coding, scientific analysis, and mathematical reasoning. They provide a 128K context window and offer adjustable reasoning levels—low, medium, or high—tailored to specific use case requirements. Furthermore, these models can integrate with external tools to enhance capabilities, making them suitable for agentic workflows using frameworks like Strands Agents.
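To illustrate the agentic angle, the sketch below wires one of the new models into a Strands Agents agent through Amazon Bedrock. This is a minimal sketch under assumptions: the model ID, Region, and the exact Strands class names should be confirmed against the Bedrock console and the Strands Agents documentation.

# A minimal sketch, assuming the Strands Agents SDK (pip install strands-agents)
# and that gpt-oss-120b is exposed in Bedrock under the ID shown below.
from strands import Agent
from strands.models import BedrockModel

# Assumed model ID and Region; confirm both in the Amazon Bedrock console.
model = BedrockModel(
    model_id="openai.gpt-oss-120b-1:0",
    region_name="us-west-2",
)

agent = Agent(
    model=model,
    system_prompt="You are a concise math and coding assistant.",
)

# The agent manages the conversation loop; here it answers a plain question.
result = agent("What is the sum of the first 50 prime numbers?")
print(result)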
Leveraging Amazon Bedrock and SageMaker JumpStart
Amazon Bedrock and SageMaker JumpStart provide access to a wide range of foundation models from leading AI providers, now including the newly introduced OpenAI models. This breadth of selection helps users match the right model to each AI workload.
Amazon Bedrock allows users to experiment with different models, mix and match capabilities, and switch between providers without rewriting code. This flexibility turns model choice into a strategic advantage, letting AI strategies evolve as new innovations emerge. At launch, the new models are accessible through an OpenAI-compatible endpoint on Amazon Bedrock, and developers can work with them using the OpenAI SDK or the Amazon Bedrock InvokeModel and Converse APIs.
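For example, a minimal call through the Converse API with boto3 might look like the sketch below. The model ID is an assumption (the exact identifier is listed on the model's page in the Bedrock console), and the "Reasoning: high" system line follows the gpt-oss convention for requesting a higher reasoning level; verify both against the Bedrock documentation and the model card.

import boto3

# A minimal sketch using the Bedrock Converse API in US West (Oregon).
# The model ID below is an assumption; copy the exact ID from the console.
bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")

response = bedrock.converse(
    modelId="openai.gpt-oss-120b-1:0",
    system=[{"text": "Reasoning: high"}],  # gpt-oss reasoning-level convention (verify in the model card)
    messages=[
        {
            "role": "user",
            "content": [{"text": "Explain the difference between a list and a tuple in Python."}],
        }
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.7},
)

# The assistant's reply comes back as a list of content blocks.
for block in response["output"]["message"]["content"]:
    if "text" in block:
        print(block["text"])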
With SageMaker JumpStart, users can quickly evaluate, compare, and customize models to suit their specific needs. The platform enables easy deployment of both the original and customized models in production through the SageMaker AI console or the SageMaker Python SDK.
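A deployment through the SageMaker Python SDK might look like the following sketch. The JumpStart model ID and the instance type are placeholders, since the exact values are shown on the model card in SageMaker Studio.

# A minimal sketch using the SageMaker Python SDK's JumpStart interface.
# The model_id and instance_type are placeholders; take the real values
# from the gpt-oss-20b model card in SageMaker Studio.
from sagemaker.jumpstart.model import JumpStartModel

model = JumpStartModel(model_id="openai-reasoning-gpt-oss-20b")  # hypothetical ID

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.12xlarge",  # placeholder; choose a supported GPU instance
)

# Once the endpoint is in service, predictor.predict() sends JSON payloads to it.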
Practical Application of OpenAI Models in Amazon Bedrock
To begin using the OpenAI open weight models in Amazon Bedrock, users need to access the Amazon Bedrock console and navigate to the ‘Model access’ section under ‘Configure and learn.’ Here, they can request access to the two listed OpenAI models.
Once access is granted, users can experiment with the models using the ‘Chat/Text’ playground. By selecting the gpt-oss-120b model, users can test its capabilities with sample prompts. For instance, a prompt about financial decision-making for a family vacation fund generates detailed output that also surfaces the model’s reasoning process.
Developers can configure the OpenAI SDK by setting the API endpoint and using an Amazon Bedrock API key for authentication. This setup allows for model invocation using the OpenAI Python SDK, facilitating the creation of AI agents with frameworks compatible with the Amazon Bedrock API or OpenAI API.
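As a sketch, that configuration could look like the following. The base URL pattern and the AWS_BEARER_TOKEN_BEDROCK environment variable are assumptions based on how Bedrock API keys are typically used; confirm both in the Amazon Bedrock documentation.

import os
from openai import OpenAI

# A minimal sketch: point the OpenAI Python SDK at Bedrock's OpenAI-compatible
# endpoint in US West (Oregon) and authenticate with a Bedrock API key.
# The endpoint path and environment variable name are assumptions.
client = OpenAI(
    base_url="https://bedrock-runtime.us-west-2.amazonaws.com/openai/v1",
    api_key=os.environ["AWS_BEARER_TOKEN_BEDROCK"],
)

completion = client.chat.completions.create(
    model="openai.gpt-oss-120b-1:0",  # assumed Bedrock model ID
    messages=[
        {"role": "user", "content": "Suggest a weekly savings plan for a $3,000 vacation fund."}
    ],
)
print(completion.choices[0].message.content)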
Exploring OpenAI Models in Amazon SageMaker JumpStart
In the Amazon SageMaker AI console, users can work with the OpenAI open weight models within SageMaker Studio. Setting up a SageMaker domain is the first step, with options for single-user or organizational setups. Once the domain is ready, users can view detailed descriptions of the gpt-oss-120b and gpt-oss-20b models.
After selecting the gpt-oss-20b model, users can deploy it by choosing the instance type and initial instance count. This deployment creates an endpoint that users can invoke from SageMaker Studio or with any of the AWS SDKs.
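After deployment, the endpoint can be called with the AWS SDK for Python, for example. The endpoint name and the messages-style payload below are assumptions; the model card in SageMaker Studio shows the exact request format the container expects.

import json
import boto3

# A minimal sketch for invoking the deployed endpoint with boto3 in US East (Ohio).
# Endpoint name and payload schema are assumptions; check the model card.
runtime = boto3.client("sagemaker-runtime", region_name="us-east-2")

payload = {
    "messages": [
        {"role": "user", "content": "Write a haiku about cloud computing."}
    ],
    "max_tokens": 256,
}

response = runtime.invoke_endpoint(
    EndpointName="gpt-oss-20b-endpoint",  # hypothetical endpoint name
    ContentType="application/json",
    Body=json.dumps(payload),
)

print(json.loads(response["Body"].read()))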
Key Information and Considerations
The OpenAI open weight models introduced by AWS are available in specific AWS Regions. Amazon Bedrock supports these models in the US West (Oregon) Region, while SageMaker JumpStart supports them in US East (Ohio), US East (N. Virginia), Asia Pacific (Mumbai), and Asia Pacific (Tokyo).
These models offer full chain-of-thought output capabilities, providing transparency into the model’s reasoning process—a valuable feature for applications requiring high levels of interpretability and validation. Additionally, the models are highly customizable, allowing users to fine-tune them for unique use cases, integrate them into existing workflows, and build upon them for industry-specific applications.
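When calling the models through the Converse API, the reasoning may come back as a separate content block alongside the final answer. The sketch below shows how one might split the two; the field names (reasoningContent, reasoningText) follow the pattern Bedrock uses for other reasoning-capable models and are an assumption to verify against the API reference.

def print_reasoning_and_answer(response):
    """Split a Converse response into reasoning and answer blocks.

    Field names (reasoningContent / reasoningText) are assumptions;
    confirm them in the Amazon Bedrock API reference.
    """
    for block in response["output"]["message"]["content"]:
        if "reasoningContent" in block:
            print("REASONING:", block["reasoningContent"]["reasoningText"]["text"])
        elif "text" in block:
            print("ANSWER:", block["text"])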
Security and safety are integral to these models, with comprehensive evaluation processes and safety measures in place. They maintain compatibility with the standard GPT-4 tokenizer, ensuring seamless integration into existing AI ecosystems.
Whether through the serverless experience of Amazon Bedrock or the extensive machine learning development capabilities of SageMaker JumpStart, users can choose their preferred environment for deploying these models. For information on associated costs, users can visit the Amazon Bedrock pricing and Amazon SageMaker AI pricing pages.
Conclusion
The introduction of OpenAI open weight models on AWS marks a significant advancement in the AI landscape, offering developers and organizations powerful tools to enhance their operations. With the ability to innovate and tailor models to specific needs, AWS continues to be a leader in providing cutting-edge AI solutions. For more detailed information, users can explore the parameters for the models and the chat completions API in the Amazon Bedrock documentation.
To get started with these innovative models, users can access the Amazon Bedrock console or the Amazon SageMaker AI console today. For further reading on this topic, the AWS Artificial Intelligence Blog provides additional insights and updates on the availability of GPT OSS models from OpenAI on SageMaker JumpStart.
For more information, visit the AWS website.