Inference API Pricing for PRO subscription
For the meta-llama-3.3-70b model, I made about 170 requests on 20th/21st March, which used up $0.02 of my $2 free limit. For the same model and a similar number of tokens, about 30 requests on 2nd April were charged $0.50 instead. The first 170 requests are labelled "Paid", while the recent 10 say "Pending". My subscription started on 20th March, so it should be valid until 20th April, right? What is going on?
This second screenshot shows $0.58 for just ~30 more requests than the previous one, and says it will be billed at the next renewal of my PRO subscription?
