OpenAI models coming to Amazon Bedrock: Interview with OpenAI and AWS CEOs

284 points - yesterday at 7:24 PM


spindump8930 yesterday at 8:17 PM
Remember that models on different inference platforms won't necessarily give exactly the same results, adding another axis of non-determinism to development. Things like quantization, custom model-serving silicon, batching, or other inference optimizations can mean the hosted model performs differently from the same model served by the original provider :/

This paper isn't the exact same scenario, since it's an auditable open-weight Llama model, but it shows the symptoms of this: https://arxiv.org/pdf/2410.20247
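A toy sketch of the quantization point above (not the paper's method, and the numbers are made up): when two token logits are nearly tied, rounding them onto a coarser grid can change which one greedy decoding picks, so the "same" model can emit different tokens on different serving stacks.

```python
import numpy as np

def quantize(logits, bits):
    """Simulate coarse quantization by snapping logits to a 2**bits grid."""
    scale = 2 ** bits
    return np.round(np.asarray(logits) * scale) / scale

# Hypothetical near-tied logits for three candidate tokens.
logits = [4.31, 4.33, 1.0]

full_precision_pick = int(np.argmax(logits))            # token 1 wins by 0.02
quantized_pick = int(np.argmax(quantize(logits, bits=4)))  # grid collapses the tie

print(full_precision_pick, quantized_pick)
```

With a 4-bit grid both leading logits round to the same value, argmax falls back to the first index, and the greedy choice flips — one mechanism (among batching, kernels, etc.) behind cross-platform drift.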

zmmmmm yesterday at 9:38 PM
Availability through Bedrock has been a major driver in use of Anthropic in my org. And I am betting there is actual margin in it as well.

I wonder if this is directly linked to the split up with Microsoft. Just from my anecdata, OpenAI is getting completely ignored in serious enterprise deployments because what they offer on Azure sucks and there is no other corporate-friendly way to get it. They probably saw themselves getting destroyed in enterprise and realised it was existential to be able to compete with Anthropic on AWS.

jasobake yesterday at 8:08 PM
As someone who works at big tech and spends countless hours in meetings hoping to get some small feature coordinated for deployment across two teams, I can't imagine the number of meetings and 6-pagers that were involved in running these models on Bedrock's hardware.
vicchenai yesterday at 10:47 PM
The enterprise sales motion here is interesting. A lot of regulated industries (finance, healthcare) have existing AWS contracts with data residency commitments baked in. OpenAI on Bedrock basically lets those orgs skip the separate DPA negotiation with OpenAI. Could be a bigger unlock than it looks on paper.
epistasis yesterday at 8:22 PM
Claude got a looooot more buy-in with a lot of privacy-concerned orgs I work with because they could access it through their "trusted" intermediary, Amazon. OpenAI has been banned and is not trusted. I'm not sure that I agree with these orgs' legal teams' assessments, but they definitely read the terms of service far closer than I did.

We will see if this changes the equation, but it feels like OpenAI is pretty far behind and playing catch-up on all fronts. Though to be honest, "pretty far behind" is like 2-8 weeks in the AI world, so it may not matter a ton; it's mostly perception. And for me and my information bubble, perception of OpenAI is rock-bottom due to Sam Altman. From appearing unethical to appearing unhinged with demands for fabs and everything else, I'm not a fan.

NikolaosC today at 8:42 AM
OpenAI just gave up Azure exclusivity, killed the AGI clause, and stopped paying Microsoft revenue share to get on AWS. Anthropic figured out 18 months ago that enterprises buy through their cloud, not from whoever has the best model. OpenAI is just catching up.
nijave yesterday at 8:27 PM
This would be a nice compliance win: one less sub-processor, and all our data is already on AWS, so there's less worry about sending it off somewhere else.
dear_prudence today at 10:39 AM
Really useful for us as we heavily rely on AWS credits for our AI usage.
quibono today at 8:51 AM
Just waiting for Gemma 4, DeepSeek 4 now. Then the only thing I'll be able to complain about is the completely different API to interface with (unless they FINALLY move to full OpenAI support).
KaiserPro yesterday at 9:34 PM
Great, I can now buy OpenAI through AWS with an interface that is totally incompatible with all my tools (unless AWS have finally given up and just made Bedrock useful by adopting the OpenAI API format).
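The incompatibility being complained about is concrete: tooling built for the OpenAI Chat Completions shape doesn't plug straight into Bedrock's Converse API, because the two expect differently shaped message payloads. A sketch of the publicly documented request bodies (model IDs are illustrative; no network calls are made):

```python
prompt = "Summarize our Q3 incident report."

# OpenAI-style Chat Completions body (what most existing tools emit):
# content is a plain string.
openai_request = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": prompt}],
}

# Bedrock Converse-style arguments: content is a list of typed blocks,
# and the model is addressed by a Bedrock model ID instead.
bedrock_request = {
    "modelId": "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "messages": [{"role": "user", "content": [{"text": prompt}]}],
}

# The same logical message, two incompatible wire formats.
print(openai_request["messages"][0]["content"])
print(bedrock_request["messages"][0]["content"][0]["text"])
```

Any tool that assumes the string-content shape has to be wrapped or rewritten to target Bedrock, which is the source of the friction above.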
2001zhaozhao yesterday at 8:57 PM
The market might be increasingly hard on AI startups in general as enterprises adopt providers like Amazon Bedrock and refuse to sign other deals.
lwarfield yesterday at 8:09 PM
Well that didn't take long.
throw03172019 yesterday at 7:48 PM
OpenAI frontier models coming to Bedrock soon?
chopete3 today at 6:07 AM
This is big news for AWS-hosted products.

Microsoft Azure has been the worst in terms of maintaining a highly available service and managing predictable latency.

Their Azure customer support is bad and not ready for any real enterprise cloud offering. They behave like Comcast customer support.

It was absolutely idiotic to lock it down to Azure. This wasn't meant to be an iPhone+AT&T combo where the phone is the be-all and end-all.

A cloud product depends on a lot of services, and nobody is going to switch cloud providers for one sweetener.

echelon yesterday at 8:25 PM
This doesn't mean you have the raw model weights, right? That's still entirely hidden / opaque?

You can just run "air gapped" inference?

Is this only of interest to enterprise customers already on AWS (who want "air gapped" behavior)? Is there any other use case for this?

This will be more expensive than calling OpenAI directly, right?

mochow13 today at 12:30 AM
OpenAI is tailgating Anthropic apparently.
shevy-java yesterday at 9:28 PM
Now they are ruining amazon too. It's fascinating to see.

AI is kind of like the ultimate corporate drug. They are all on it. And can't get rid of it - ever again.

try-working yesterday at 11:55 PM
OpenAI marching towards its future as a dumb pipe.