
Quicknode Outage History

Uptime record, past incidents, and downtime history for Quicknode.

4.3% uptime over 46 days
SLA thresholds: 99.9% ✗ · 99.5% ✗ · 99% ✗ · 95% ✗

90-Day Trend

[90-day trend chart: Mar 28 – May 12]

Monthly Uptime

Month Uptime Days Tracked Days with Issues
May 2026 0% 12 12
April 2026 0% 30 30
March 2026 50% 4 2

Uptime is calculated from daily worst-status snapshots. A day with any non-operational status counts as a day with issues.
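The rule above amounts to a small aggregation over daily snapshots; a minimal sketch for illustration (the status labels and function name are assumptions, not Quicknode's actual pipeline):

```python
def summarize_uptime(daily_worst):
    """Aggregate per-day worst-status snapshots into uptime stats.

    daily_worst: one status string per tracked day, where "operational"
    means no issue that day (labels assumed for illustration).
    """
    tracked = len(daily_worst)
    issue_days = sum(1 for status in daily_worst if status != "operational")
    uptime_pct = 100.0 * (tracked - issue_days) / tracked if tracked else 0.0
    return {"days_tracked": tracked,
            "days_with_issues": issue_days,
            "uptime_pct": round(uptime_pct, 1)}

# 2 clean days out of 46 tracked reproduces the 4.3% figure above
stats = summarize_uptime(["operational"] * 2 + ["degraded"] * 44)
```

Because a single non-operational snapshot marks the whole day, this metric is deliberately pessimistic: one brief degradation counts the same as a full-day outage.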

Daily Status (Last 46 Days)

[Daily status chart: Mar 28 – Today. Legend: Operational, Degraded, Partial Outage, Major Outage, Maintenance, No Data]

Incident History

May 2026
Fluent Testnet - Stuck Network
minor

Started: May 12, 8:38 AM

investigating
Fluent Mainnet appears to have stalled at block 26,579,757 - The Quicknode team is investigating and will update with more information once it is available.
May 12, 8:38 AM
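Stalls like this one can be detected client-side by sampling the head height twice and checking that it advanced; a minimal sketch (the probe interval and the `fetch_block_number` callable, e.g. a JSON-RPC `eth_blockNumber` call against your own endpoint, are assumptions):

```python
import time

def head_is_stalled(fetch_block_number, probe_interval=30.0, sleep=time.sleep):
    """Probe the chain head twice; if it has not advanced, report a stall.

    fetch_block_number: zero-arg callable returning the latest block height
    (for an EVM chain this would wrap a JSON-RPC eth_blockNumber call).
    """
    first = fetch_block_number()
    sleep(probe_interval)  # injectable so tests need not actually wait
    second = fetch_block_number()
    return second <= first

# Stubbed example (a real check would hit an RPC endpoint):
heights = iter([26_579_757, 26_579_757])
stalled = head_is_stalled(lambda: next(heights), sleep=lambda s: None)
```

The probe interval should comfortably exceed the chain's normal block time, otherwise a healthy but slow chain will be flagged as stalled.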
Degraded Performance - TON Mainnet
minor

Started: May 11, 8:19 PM

monitoring
Nodes have stabilized and 503s are down significantly. We're working with the foundation on a permanent fix.
May 11, 10:16 PM
investigating
The QuickNode team is investigating the degraded performance of TON Mainnet. We will update this page as we acquire new information. Users may experience 503s at this time.
May 11, 8:19 PM
Billing Dashboard Data Delay
minor

Started: May 11, 4:51 PM

identified
We’re currently experiencing a delay in billing-related data appearing in the dashboard. Charges and usage may take longer than usual to reflect, but billing is still processing normally in the background. We’re working to restore normal update times and will post an update once data is fully caught up.
May 11, 4:51 PM
Quicknode Dashboard: Investigating Dashboard Loading Issues
minor

Started: May 10, 4:17 PM

monitoring
A fix has been implemented and we are monitoring the results.
May 10, 11:17 PM
investigating
We're currently investigating an issue where some users may experience longer load times on the dashboard, and KV API requests may return 503 errors.
May 10, 4:17 PM
Base Mainnet & Base Sepolia - Degraded Performance
minor

Started: May 10, 1:01 PM

investigating
We are aware of a node peering stability issue affecting Base Mainnet and Testnet. The Base Foundation has published an incident report attributing this to upcoming changes to p2p peer discovery. Quicknode Base Infrastructure may be impacted and users could see intermittent 503 errors or request timeouts during this period. There is no action required on your end. Our team is monitoring the situation closely and will provide updates here as more information becomes available.
May 10, 1:01 PM
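For intermittent 503s like these, client-side retries with jittered exponential backoff are the usual mitigation; a hedged sketch (the helper name, delays, and retryable status set are assumptions, not a Quicknode recommendation):

```python
import random
import time

def call_with_retries(request_once, attempts=5, base_delay=0.5,
                      retryable=(503, 504), sleep=time.sleep):
    """Invoke request_once() -> (status, body), retrying transient errors.

    Delays grow as base_delay * 2**attempt plus a little jitter so many
    clients do not retry in lockstep.
    """
    for attempt in range(attempts):
        status, body = request_once()
        if status not in retryable:
            return status, body
        sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
    return status, body  # give up: surface the last transient error

# Stubbed example: two 503s, then success.
responses = iter([(503, ""), (503, ""), (200, "ok")])
status, body = call_with_retries(lambda: next(responses), sleep=lambda s: None)
```

The jitter term matters during provider-side incidents: without it, every client retries on the same schedule and the recovering service absorbs synchronized bursts.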
Base Sepolia: Degraded Performance
minor

Started: May 10, 12:09 AM

investigating
We are currently investigating an issue affecting archive calls on Base Sepolia. We will provide updates as they become available.
May 10, 12:09 AM
Worldchain Mainnet: Degraded Performance
minor

Started: May 9, 8:51 PM

monitoring
A fix has been implemented and we are monitoring the results.
May 9, 9:06 PM
investigating
We are currently investigating degraded performance on our Worldchain Mainnet nodes. Users may experience 503s and null responses. We will provide updates as they become available.
May 9, 8:51 PM
Quicknode Key-Value Store: Degraded Performance
minor

Started: May 8, 11:15 PM

monitoring
A fix has been implemented and we are monitoring the results.
May 9, 12:04 AM
investigating
We are currently experiencing a service disruption affecting the Quicknode Key-Value Store API. Customers may be unable to access this service. Our team is actively looking into the issue and will provide updates as soon as possible.
May 8, 11:32 PM
Quicknode Streams: Degraded Performance
minor

Started: May 8, 9:54 PM

monitoring
A fix has been implemented and we are monitoring the results.
May 8, 9:59 PM
investigating
We are currently investigating this issue.
May 8, 9:54 PM
Quicknode Dashboard: Investigating Dashboard Loading Issues
minor

Started: May 8, 7:45 AM

monitoring
A fix has been implemented and we are monitoring the results.
May 8, 8:52 AM
identified
The issue has been identified, and our engineers are actively working on a resolution. Please expect the next update by 11:00 AM UTC.
May 8, 8:39 AM
investigating
We are currently investigating this issue.
May 8, 8:02 AM
Quicknode Streams API, Key-Value Storage API: Degraded Performance
minor

Started: May 8, 7:45 AM

monitoring
A fix has been implemented and we are monitoring the results.
May 8, 9:04 AM
identified
The issue has been identified, and our engineers are actively working on a resolution. Please expect the next update by 11:00 AM UTC.
May 8, 8:39 AM
investigating
We are currently investigating this issue.
May 8, 8:16 AM
XRP Testnet: Nodes Stalled at Block 17186304
minor

Started: May 8, 3:16 AM

identified
Our engineers are closely monitoring updates from the foundation.
May 8, 4:01 AM
investigating
We are currently investigating this issue.
May 8, 3:16 AM
Degraded Performance - Polygon Amoy
minor

Started: May 7, 8:47 PM

identified
The Quicknode team is aware of the degraded performance of the Polygon Amoy Network. The foundation has notified us they are performing stress tests on the network. Users may experience elevated latencies, timeouts, and 503 errors at this time.
May 7, 8:47 PM
Degraded Performance - Ethereum Mainnet
minor

Started: May 7, 4:05 PM

monitoring
A fix has been implemented and we are monitoring the results.
May 7, 5:02 PM
investigating
The Quicknode team is investigating the degraded performance of specific methods such as eth_getLogs on Ethereum Mainnet - we will update this page as we acquire new information. Users may experience elevated latency and timeouts at this time.
May 7, 4:05 PM
Arbitrum Mainnet - Sporadic API Errors
minor

Started: May 7, 2:37 PM

monitoring
A fix has been implemented and we are monitoring the results.
May 7, 5:23 PM
investigating
The Quicknode team is investigating sporadic API errors being served on Arbitrum Mainnet - we will update this page as we acquire new information. Users may experience sporadic 503 errors at this time.
May 7, 2:37 PM
Degraded Performance - HyperEVM Testnet
minor

Started: May 4, 9:17 PM

monitoring
A fix has been implemented and we are monitoring the results.
May 4, 9:29 PM
investigating
The QuickNode team is investigating the degraded performance of HyperEVM Testnet. We will update this page as we acquire new information. Users may experience 503 issues at this time.
May 4, 9:17 PM
Degraded Performance - Tron Mainnet FRA Region
minor

Started: May 3, 9:09 PM

investigating
The QuickNode team is investigating the degraded performance of Tron Mainnet in FRA region. We rerouted FRA traffic to a nearby region while we continue to investigate. Users may experience heightened latency. We will update this page as we acquire new information.
May 3, 9:09 PM
Sei Pacific : Degraded Performance
minor

Started: May 2, 5:08 PM

identified
Our engineers are closely monitoring updates from the foundation. We will provide another update on or before May 8, 17:00 UTC.
May 6, 5:58 AM
identified
The Quicknode team is actively working to resolve the degraded performance affecting Sei Pacific. We are coordinating closely with the Sei team to address the issue. This page will be updated as new information becomes available. Users may continue to experience elevated latency and intermittent timeouts at this time.
May 2, 7:22 PM
identified
The Quicknode team is continuing to mitigate the degraded performance of Sei Pacific. We will update this page as we acquire new information. Users may continue to experience heightened latency and timeouts at this time.
May 2, 6:30 PM
investigating
The Quicknode team is investigating the degraded performance of Sei Pacific - we will update this page as we acquire new information. Users may experience heightened latency and timeouts at this time.
May 2, 5:08 PM
Stuck Network - Polygon zkEVM
minor

Started: May 1, 2:24 PM

monitoring
A fix has been implemented and we are monitoring the results.
May 1, 4:37 PM
investigating
The Polygon zkEVM network appears to have stalled. The Quicknode team is investigating and will update with more information once it is available.
May 1, 2:24 PM
Worldchain Mainnet: Block Height Stalled at 29,128,981
minor

Started: May 1, 1:30 AM

monitoring
A fix has been implemented and we are monitoring the results.
May 1, 4:19 AM
identified
The issue has been identified and a fix is being implemented.
May 1, 4:18 AM
investigating
We are currently investigating this issue.
May 1, 12:56 AM
April 2026
Stuck Network - Blast Sepolia
minor

Started: Apr 29, 4:52 PM

investigating
The Blast Sepolia network appears to have stalled at block 36373475 - The Quicknode team is investigating and will update with more information once it is available.
Apr 29, 4:52 PM
Degraded Performance - Polygon Mainnet
minor

Started: Apr 29, 3:45 PM

monitoring
A fix is in place and we are monitoring performance to ensure stability. We will provide an update within 10 minutes, upon confirmation or if further action is required.
Apr 29, 6:55 PM
identified
Quicknode Polygon Mainnet endpoints may experience degraded performance, including intermittent 503 responses, increased latency, and nodes lagging behind tip while checkpoint/milestone processing is disrupted. Our team is actively applying mitigations and working with upstream guidance to restore normal service, and we will post further updates to this incident as more information becomes available.
Apr 29, 6:32 PM
investigating
The Quicknode team is continuing to investigate the degraded performance of Polygon Mainnet and we are working closely with the foundation to resolve the issue. We will update this page as we acquire new information. Users may be served 503s at this time.
Apr 29, 4:31 PM
investigating
The Quicknode team is investigating the degraded performance of Polygon Mainnet - we will update this page as we acquire new information. Users may be served 503s at this time.
Apr 29, 3:45 PM
Solana Priority Fee API: Critical Outage
critical

Started: Apr 29, 12:20 AM

investigating
Quicknode Solana Priority Fee API is currently experiencing an outage. Requests to the API will fail at this time. Our team is actively investigating the issue with priority and will follow up as soon as new information becomes available.
Apr 29, 12:20 AM
investigating
We are currently investigating this issue.
Apr 29, 12:18 AM
Osmosis Mainnet: Degraded Performance

Started: Apr 28, 2:15 AM

monitoring
The Quicknode team is actively working with the foundation team on this. Nodes have recovered, but you may experience intermittent degraded performance for gRPC calls. We are monitoring performance to ensure stability.
Apr 30, 7:06 AM
identified
The issue has been identified, and a fix is being implemented. During this time, you may experience intermittent failures or degraded performance for gRPC calls.
Apr 28, 2:26 AM
monitoring
A fix has been implemented and we are monitoring the results.
Apr 28, 1:07 AM
identified
The Quicknode team is aware of the degraded performance on Osmosis Mainnet. This issue has been identified and our team is actively working on resolving the issue. Users may experience sporadic 503 errors during this time. We will update this page as we acquire new information.
Apr 23, 4:48 PM
Polkadot Mainnet FRA Region - Degraded Performance
minor

Started: Apr 27, 8:09 PM

identified
The issue has been identified and a fix is being implemented.
Apr 27, 8:35 PM
investigating
The QuickNode team is investigating degraded performance of Polkadot Mainnet in FRA region. We are seeing elevated 503 errors at this time.
Apr 27, 8:09 PM
Fantom Mainnet - Block Height Stall
minor

Started: Apr 27, 12:01 PM

identified
Engineers have identified the issue and are actively working on a fix for Fantom nodes where block height has stalled. Users may experience stale responses or inconsistent block data in the meantime. We’ll provide another update as progress is made.
Apr 27, 6:10 PM
investigating
We are aware of an issue affecting Fantom nodes where block height has stalled. Our team is currently investigating the root cause. Users may experience stale responses or inconsistent block data. We apologize for any inconvenience and will share an update as soon as we have more information.
Apr 27, 12:01 PM
Singapore Region Node Disruption
minor

Started: Apr 26, 2:02 PM

monitoring
The situation is now stabilized. Affected nodes in the Singapore region are back online and services are recovering as expected. Our team continues to monitor the environment closely to ensure full stability. We will confirm full resolution once all nodes have been verified healthy.
Apr 26, 3:32 PM
investigating
We have identified a disruption affecting nodes hosted in the Singapore region. The issue was caused by a shutdown of several instances by our cloud infrastructure provider. Affected nodes are coming back online and re-syncing. Our team is actively monitoring recovery.
Apr 26, 2:02 PM
Solana Testnet Scheduled Maintenance (Apr 23, 2026 14:00 UTC)

Started: Apr 24, 3:17 PM

monitoring
The Solana Testnet Network has restarted following the network upgrade. We are monitoring for stability.
Apr 24, 3:17 PM
Quicknode Streams: Degraded Performance
minor

Started: Apr 24, 2:45 AM

investigating
We are currently investigating this issue.
Apr 24, 4:02 AM
Solana Mainnet FRA Region - Degraded Performance
minor

Started: Apr 22, 11:01 PM

monitoring
A fix has been implemented and we are monitoring the results.
Apr 23, 6:40 AM
investigating
Quicknode team is investigating degraded performance of Solana Mainnet in Frankfurt region. We will update this page as we acquire new information. Users may experience temporary 503 responses at this time.
Apr 22, 11:01 PM
Degraded Performance - Polkadot Mainnet NRT Region
minor

Started: Apr 22, 8:10 PM

monitoring
A fix has been implemented and we are monitoring the results.
Apr 22, 8:44 PM
investigating
The Quicknode team is investigating the degraded performance of Polkadot Mainnet in the NRT region. We will update this page as we acquire new information. Users may be served 503s at this time in the affected region.
Apr 22, 8:10 PM
Degraded Performance - Polygon Mainnet NRT Region
minor

Started: Apr 22, 3:09 PM

monitoring
A fix has been implemented and we are monitoring the results.
Apr 22, 3:33 PM
investigating
The Quicknode team is investigating the degraded performance of Polygon Mainnet in the NRT region - we will update this page as we acquire new information. Users may be served 503s at this time in the affected region.
Apr 22, 3:09 PM
Degraded Performance - Solana Mainnet
minor

Started: Apr 21, 4:12 PM

monitoring
A fix has been implemented and we are monitoring the results.
Apr 21, 4:30 PM
investigating
The QuickNode team is investigating the degraded performance of Solana Mainnet - we will update this page as we acquire new information. Users may be served 503 responses at this time.
Apr 21, 4:12 PM
Solana Mainnet Degraded Performance
minor

Started: Apr 21, 1:28 AM

monitoring
A fix has been implemented and we are monitoring the results.
Apr 21, 2:53 AM
identified
The issue has been identified and a fix is being implemented.
Apr 21, 2:53 AM
investigating
We are continuing to investigate this issue. We will post another update at 02:30 UTC.
Apr 21, 2:02 AM
investigating
The QuickNode team is investigating the degraded performance of Solana Mainnet. We are seeing elevated 503 errors at this time.
Apr 21, 1:28 AM
Base Sepolia - Increase in `intrinsic gas too high` Errors on eth_estimateGas Calls
minor

Started: Apr 20, 7:43 PM

monitoring
Our engineers completed upgrading the Base Sepolia node infrastructure to 0.7.6 https://github.com/base/base/releases/tag/v0.7.6
Apr 21, 8:11 AM
identified
Foundation cut a new release with a couple of bug fixes for Azul (latest hardfork) on Base Sepolia. https://github.com/base/base/releases/tag/v0.7.6
Apr 21, 8:09 AM
investigating
We are investigating an increase in `intrinsic gas too high` API errors on `eth_estimateGas` requests on Base Sepolia that started with the Sepolia Azul Upgrade at 18:00 UTC today. We are working with the Base team to address this issue and will update this status with findings and resolution as soon as possible.
Apr 20, 7:43 PM
TON Mainnet Degraded Performance

Started: Apr 20, 9:48 AM

monitoring
The Quicknode team has been upgrading nodes to the latest version from the foundation team and is monitoring performance to ensure stability.
Apr 30, 7:01 AM
identified
The Ton Foundation has provided a new upgrade today to address the issues regarding Lite Servers experiencing synchronization instability. Our team will be upgrading to the newest version and we will update this page with further updates as we progress.
Apr 28, 3:42 PM
identified
The TON Foundation has identified the root cause as liteserver synchronization instability affecting multiple operators. An interim update has been made available for Lite Server and Full Node operators to improve stability and reduce sync issues but a full resolution requires a validator update from the TON Foundation, which is expected on April 28. We will continue to update this page as the situation progresses.
Apr 23, 11:34 AM
identified
Our engineers are working closely with the TON Foundation. We are awaiting a new TON client upgrade release from the foundation that will resolve the issue. We will update this page once we receive further updates from the foundation.
Apr 21, 5:52 PM
identified
Our engineers are working closely with the TON Foundation. We’ll provide the next update by 5 PM UTC.
Apr 21, 8:13 AM
identified
The TON Foundation has confirmed this is an upstream network issue affecting multiple providers. Their engineering team has been notified and is actively working on a resolution. We will continue to monitor the situation and provide updates as soon as we have more information.
Apr 20, 11:50 AM
investigating
We are aware of an issue affecting TON Mainnet endpoints. Some users may experience elevated error rates, increased latency, or failed requests. Our team is actively investigating the root cause and working to restore full service as quickly as possible. We apologize for any inconvenience and will post updates as the situation develops.
Apr 20, 9:48 AM
Stacks Testnet: Block Height Stalled at 3,959,436
minor

Started: Apr 18, 10:52 PM

investigating
We are currently investigating this issue. Users requesting data beyond this block will experience 503s. The Quicknode team is investigating and will update with more information once it is available.
Apr 18, 10:52 PM
Polygon Amoy: Archive Nodes Stalled at Block 36,805,071
minor

Started: Apr 17, 9:38 PM

investigating
Polygon Amoy archive nodes are currently stalled at block 36,805,071. Requests for historical data following this block will result in 503s. Users querying tip data are not affected. Our team is actively investigating the cause with priority and we'll provide updates as new information becomes available.
Apr 17, 9:38 PM
Base Sepolia - Stuck Network at Block 40297498
minor

Started: Apr 16, 6:30 PM

investigating
The Base Sepolia Network appears to have stalled at block 40297498. This is a network-wide stall and is not limited to Quicknode. The Quicknode team is investigating and will update with more information once it is available. You can follow the official Base Status Page for further information: https://status.base.org/incidents/l6d3mnl0c9hs
Apr 16, 6:30 PM
Quicknode Platform Services - Service Disruption

Started: Apr 15, 12:19 PM

monitoring
Update: The issue affecting Dashboard, Streams, Webhooks, and KV REST APIs appears to be resolved. Services have been restored and are operational. Our DevEx team is actively monitoring to ensure full service stability. We will provide a final update once monitoring is complete.
Apr 15, 12:31 PM
investigating
We are currently experiencing service disruptions affecting multiple Quicknode platform services. Affected services include Dashboard, Streams, Webhooks, and KV REST APIs. Customers may be unable to access these services. Quicknode RPC endpoints remain fully operational. Only platform management services are affected. Our engineering team is actively investigating and working to restore these services. We will provide the next update by 2PM UTC or sooner.
Apr 15, 12:19 PM
Stuck Network - Solana Devnet

Started: Apr 14, 4:30 AM

monitoring
Some nodes have recovered, and 503s have been mitigated. Quicknode teams are continuing to work to recover full capacity and monitor performance to ensure stability.
Apr 14, 7:42 AM
identified
The QuickNode team is currently following the foundation team's guidance to recover the nodes and will provide further updates as more information becomes available.
Apr 14, 5:00 AM
investigating
Solana Devnet is currently stalled at block 455,406,479. This has been confirmed as a network-wide issue and is expected, according to the foundation team. QuickNode is awaiting further guidance and will provide updates as more information becomes available.
Apr 14, 4:30 AM
Investigating | Dashboard, Streams & KV Store REST API
minor

Started: Apr 12, 8:55 AM

monitoring
The issue intermittently affecting the QuickNode Dashboard, Streams, and KV Store REST API has been resolved. Services have recovered and are operating normally. Our team will continue to monitor the situation closely to ensure stability is maintained. We apologize for any inconvenience this may have caused.
Apr 12, 9:45 AM
investigating
We are currently investigating an issue intermittently affecting the QuickNode Dashboard, Streams, and KV Store REST API. Some customers may experience degraded performance or errors. Our team is actively working to identify the root cause. We will provide updates as more information becomes available.
Apr 12, 8:55 AM
Degraded Performance - Polygon Amoy Debug and Archive Requests
minor

Started: Apr 10, 3:52 PM

monitoring
A fix has been implemented and we are monitoring the results.
Apr 10, 6:20 PM
identified
The Quicknode team has identified the cause of the degraded performance of Polygon Amoy archive and debug requests and our engineers are applying a fix. Users may still experience 503s during this time for archive and debug RPC calls.
Apr 10, 4:34 PM
investigating
The QuickNode team is investigating the degraded performance of Polygon Amoy archive and debug requests - we will update this page as we acquire new information. Users may experience 503s during this time for archive and debug RPC calls.
Apr 10, 3:52 PM
Fuel Sepolia - Stuck Network
minor

Started: Apr 9, 10:10 PM

investigating
Fuel Sepolia network appears to have stalled at block 59,643,457. The QuickNode team is investigating and will update with more information once it is available.
Apr 9, 10:10 PM
Morph Mainnet – Block Production Halted
minor

Started: Apr 8, 8:38 AM

monitoring
Block production on Morph Mainnet has resumed and nodes are currently catching up and syncing. The Morph Foundation has acknowledged an issue with the sequencer as the root cause of the halt. We are continuing to monitor until full sync is confirmed.
Apr 8, 9:18 AM
investigating
We are currently experiencing a service disruption on Morph Mainnet. Block production has halted due to an issue with the sequencer. This incident has been acknowledged by the Morph Foundation, who are actively investigating. We will provide updates as the situation develops.
Apr 8, 8:38 AM
Morph Mainnet: Block Height Stalled at 22013628

Started: Apr 8, 6:15 AM

identified
This is a chain-level incident. Our engineers are actively monitoring updates from the Morph Foundation; expect the next update by 8 AM UTC.
Apr 8, 6:28 AM
investigating
We are currently investigating this issue.
Apr 8, 6:19 AM
Degraded Performance - Hedera Mainnet
minor

Started: Apr 7, 2:41 PM

investigating
The Quicknode team is investigating the degraded performance of Hedera Mainnet - we will update this page as we acquire new information. Users may experience API errors and 503 responses during this time.
Apr 7, 2:41 PM
Stuck Network - Blast Sepolia
minor

Started: Apr 4, 2:29 PM

monitoring
A fix has been implemented and we are monitoring the results.
Apr 7, 12:07 AM
identified
This is a chain-wide stall. Quicknode teams are actively working with the Foundation to restore normal block progression. We will provide an update as more information becomes available.
Apr 5, 7:52 AM
investigating
The Blast Sepolia Network is experiencing intermittent stalls while it slowly syncs to chain tip - The Blast team is already aware of the issue and they are investigating to resolve the issue. The Quicknode team will update this page once further updates become available.
Apr 4, 2:29 PM
B3 Sepolia: Network-wide stall at block height 61,001,532
minor

Started: Apr 3, 12:42 AM

investigating
B3 Sepolia is experiencing a network-wide stall at block height 61,001,532. This is not isolated to Quicknode. We will provide the next update as soon as new information becomes available. Explorer: https://sepolia.explorer.b3.fun/
Apr 3, 12:42 AM
Sui Testnet: Network-wide stall at block height 321,596,099
minor

Started: Apr 2, 10:25 PM

investigating
Sui Testnet is experiencing a network-wide stall at block height 321,596,099. This is not isolated to Quicknode. We will provide the next update as soon as new information becomes available. Explorer: https://testnet.suivision.xyz/
Apr 2, 10:25 PM
Stuck Network - 0G Galileo
minor

Started: Apr 1, 4:16 PM

investigating
The 0G Galileo testnet appears to have stalled at block 25181554. The stall is network-wide and is not isolated to Quicknode. Our team has reached out to the foundation for further information. This page will be updated as we learn more.
Apr 1, 4:16 PM
Cosmos Mainnet: Degraded Performance
minor

Started: Apr 1, 3:09 PM

monitoring
We have implemented a fix, and service is recovering. We are actively monitoring the situation and will share additional updates as we confirm continued stability.
Apr 1, 5:08 PM
investigating
On Cosmos, customers may experience degraded performance for both HTTP and WebSocket traffic, including increased error rates (e.g., 503 responses), elevated latency, and intermittent request failures. To resolve this, we are performing an upgrade and working to restore normal service as quickly as possible. We will share the next update as soon as we confirm the cause and mitigation steps, and we'll continue to provide updates as more information becomes available.
Apr 1, 3:09 PM
Fluent Testnet: Block Height Stalled
minor

Started: Apr 1, 2:50 PM

monitoring
The block height stall on Fluent Testnet has been addressed and all nodes are now synced and producing blocks normally. We are monitoring to ensure stability. We will provide a final update shortly to confirm full resolution.
Apr 1, 3:44 PM
investigating
We are aware of a block height stall affecting Fluent Testnet. Requests may return stale or inconsistent data during this time. We are coordinating closely with the Fluent chain team to identify and resolve the underlying cause. Our engineering team is actively working to restore normal service. Recommended action: No action is required on your end. Requests to Fluent Testnet may return outdated block data until the issue is resolved. We will provide a further update within the next 60 min...
Apr 1, 2:50 PM
March 2026
Gnosis Mainnet Archive RPC Degradation
minor

Started: Mar 30, 7:31 PM

monitoring
A fix has been implemented and we are monitoring the results. We will follow up with an update by 21:00 UTC.
Mar 30, 8:31 PM
identified
Our team has identified the issue with the affected nodes and are currently working on recovery. We will follow up with an update by 20:30 UTC.
Mar 30, 8:03 PM
investigating
We are investigating an issue impacting Gnosis (GNO) Mainnet Archive RPC where some requests are returning elevated HTTP 503 errors and may experience degraded availability. The team is actively working to identify the cause and restore normal service. We will follow up with an update by 20:00 UTC.
Mar 30, 7:31 PM
Bitcoin Cash (BCH) Block Production Stall
minor

Started: Mar 30, 11:53 AM

investigating
We are currently investigating a block production stall on the Bitcoin Cash (BCH) mainnet. The network appears to have stopped producing new blocks, which is also confirmed by public explorers showing the same block height. We are monitoring the situation closely and will provide updates as they become available.
Mar 30, 11:53 AM