Serverless Inference Issue
Incident Lifecycle
Incident Timeline
Monitoring
A fix has been implemented and we are monitoring the results.
Apr 6, 2026 at 3:15 PM UTC
Investigating
Our Engineering team is investigating an issue with Serverless inference.
At this time, users may experience high error rates for open source models (Llama 3.3 70B).
We apologize for the inconvenience and will share an update once we have more information.
Apr 6, 2026 at 12:28 PM UTC