
Outage in Elastic Cloud

ECH Customers using LLM chat inference through Elastic Inference Service may be under-billed

Resolved Major
July 09, 2025 - Started 9 months ago - Lasted 2 days
Official incident page

Incident Report

ECH customers using LLM chat inference through the Elastic Inference Service may be under-billed for their usage. The issue has been identified and a fix is under way.

Latest Updates (most recent first)
RESOLVED 9 months ago - at 07/11/2025 10:04PM

The billing process for LLM chat inference has been fixed. ECH customers should now see their inference usage reported normally under their organization's billing page.

IDENTIFIED 9 months ago - at 07/09/2025 07:22PM

ECH customers using LLM chat inference through the Elastic Inference Service may be under-billed for their usage. The issue has been identified and a fix is under way.

