OpenAI CEO Sam Altman says the company is ‘out of GPUs’


OpenAI CEO Sam Altman says the company was forced to stagger the rollout of its new model, GPT-4.5, because OpenAI is "out of GPUs."

In a post on X, Altman said that GPT-4.5, which he described as "giant" and "expensive," requires "several thousand" more GPUs before additional ChatGPT users can get access. GPT-4.5 rolls out starting Thursday to ChatGPT Pro subscribers first, followed by ChatGPT Plus customers next week.

Perhaps owing to its enormous size, GPT-4.5 is wildly expensive. OpenAI is charging $75 per million input tokens (roughly 750,000 words) fed into the model and $150 per million tokens the model generates. That's 30x the input cost and 15x the output cost of OpenAI's workhorse GPT-4o model.
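To make the comparison concrete, here is a small Python sketch of the pricing arithmetic. The GPT-4.5 rates come straight from the article; the GPT-4o rates are back-calculated from the 30x/15x multipliers the article quotes, not stated directly:

```python
# GPT-4.5 prices as quoted in the article (USD per 1M tokens).
GPT45_INPUT = 75.00
GPT45_OUTPUT = 150.00

# Implied GPT-4o prices, derived from the article's 30x / 15x multipliers
# (assumption: these are back-calculated, not quoted figures).
gpt4o_input = GPT45_INPUT / 30    # $2.50 per 1M input tokens
gpt4o_output = GPT45_OUTPUT / 15  # $10.00 per 1M output tokens

def request_cost(input_tokens, output_tokens, in_rate, out_rate):
    """Cost in USD for one request, given per-million-token rates."""
    return (input_tokens / 1e6) * in_rate + (output_tokens / 1e6) * out_rate

# Example: a request with 10,000 input tokens and 2,000 output tokens.
print(request_cost(10_000, 2_000, GPT45_INPUT, GPT45_OUTPUT))   # 1.05
print(request_cost(10_000, 2_000, gpt4o_input, gpt4o_output))   # 0.045
```

At these rates, the same hypothetical request costs about 23x more on GPT-4.5 than on GPT-4o.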

"We've been growing a lot and are out of GPUs," Altman wrote. "We will add a few thousand GPUs next week and roll it out to the Plus tier […] This isn't how we want to operate, but it's hard to perfectly predict the surges in enthusiasm that lead to GPU shortages."

Altman has previously said that a shortage of computing capacity is delaying the company's products. OpenAI hopes to address the problem in the coming years by developing its own AI chips and by building a massive network of datacenters.
