The Graph
Identified - Some subgraphs may not be indexed by the Upgrade Indexer. This is due to an issue with some Upgrade Indexer improvements we are making and will be resolved soon.

If your subgraph is not being indexed by the Upgrade Indexer, please reach out to us on Discord or Telegram and we can help you get your subgraph reallocated.
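As a quick self-check before reaching out, you can ask graph-node's indexing-status API whether a deployment is currently being indexed and healthy. The sketch below is illustrative only: the index-node endpoint URL and the deployment hash are placeholders, not values published on this page.

    # Sketch: query graph-node's indexing-status API for a deployment.
    # INDEX_NODE_URL and DEPLOYMENT_ID are placeholders (assumptions).
    import requests

    INDEX_NODE_URL = "https://example-index-node/graphql"  # placeholder endpoint
    DEPLOYMENT_ID = "QmYourDeploymentHash"                  # placeholder IPFS hash

    query = """
    query ($ids: [String!]!) {
      indexingStatuses(subgraphs: $ids) {
        subgraph
        synced
        health
        fatalError { message }
      }
    }
    """

    resp = requests.post(
        INDEX_NODE_URL,
        json={"query": query, "variables": {"ids": [DEPLOYMENT_ID]}},
    )
    resp.raise_for_status()
    for status in resp.json()["data"]["indexingStatuses"]:
        state = "synced" if status["synced"] else "syncing"
        print(status["subgraph"], status["health"], state)

If the deployment does not appear in the response at all, that is a sign it is not being indexed and you should contact us as described above.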

Nov 18, 2024 - 17:01 UTC
Identified - The issue has been identified and a fix is being implemented.
Aug 30, 2024 - 20:47 UTC

About This Site

Status, incident and maintenance information for The Graph.
Use the RSS feed together with https://zapier.com/apps/rss to have updates sent straight to Slack or Discord.
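If you prefer not to use Zapier, a small polling script can forward new feed items to a Slack or Discord incoming webhook. This is a minimal sketch under stated assumptions: the feed URL and webhook URL are placeholders (the payload shown uses Slack's "text" field; Discord webhooks expect "content" instead).

    # Sketch: poll the status page RSS feed and forward new entries to a webhook.
    # FEED_URL and WEBHOOK_URL are placeholders (assumptions), not real addresses.
    import time
    import feedparser   # pip install feedparser
    import requests

    FEED_URL = "https://status.example.com/history.rss"       # placeholder feed URL
    WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY"  # placeholder webhook

    seen = set()
    while True:
        for entry in feedparser.parse(FEED_URL).entries:
            if entry.id not in seen:
                seen.add(entry.id)
                # Slack-style payload; for Discord use {"content": ...}
                requests.post(WEBHOOK_URL, json={"text": f"{entry.title}\n{entry.link}"})
        time.sleep(300)  # poll every 5 minutes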

Component status over the past 90 days:

Network - Gateway: Operational (100.0% uptime)
Network - Explorer: Operational (100.0% uptime)
Network - Subgraph Studio: Operational (100.0% uptime)
Network - Network Subgraph: Operational (100.0% uptime)
Upgrade Indexer - Queries: Operational (99.91% uptime)
Upgrade Indexer - Subgraph Health: Degraded Performance (100.0% uptime)
Upgrade Indexer - Indexing: Major Outage (96.04% uptime)
Mainnet: Operational (100.0% uptime)
Sepolia: Operational (100.0% uptime)
Holesky: Operational (100.0% uptime)
Polygon (Matic): Operational (99.84% uptime)
Polygon Amoy: Operational (100.0% uptime)
Polygon zkEVM: Operational (100.0% uptime)
Polygon zkEVM Testnet: Operational (100.0% uptime)
BSC: Operational (100.0% uptime)
BSC Chapel: Operational (100.0% uptime)
Avalanche: Operational (100.0% uptime)
Avalanche Fuji: Operational (100.0% uptime)
xDai: Operational (100.0% uptime)
Fantom: Operational (100.0% uptime)
Fantom Testnet: Operational (100.0% uptime)
Celo: Operational (100.0% uptime)
Celo Alfajores: Operational (100.0% uptime)
Fuse: Operational (100.0% uptime)
Moonbeam: Operational (100.0% uptime)
Moonriver: Operational (99.63% uptime)
Arbitrum One: Operational (99.99% uptime)
Arbitrum Sepolia: Operational (100.0% uptime)
Optimism: Operational (100.0% uptime)
Optimism Sepolia: Operational (100.0% uptime)
Aurora: Operational (100.0% uptime)
Aurora Testnet: Operational (100.0% uptime)
Arweave: Operational (100.0% uptime)
Cosmos: Major Outage (0.05% uptime)
Osmosis: Major Outage (76.45% uptime)
zkSync Era: Operational (100.0% uptime)
zkSync Era Sepolia: Operational (100.0% uptime)
Base: Operational (100.0% uptime)
Base Sepolia: Operational (100.0% uptime)
NEAR: Operational (100.0% uptime)
NEAR Testnet: Operational (100.0% uptime)
Harmony: Operational (100.0% uptime)
Scroll Mainnet: Operational (100.0% uptime)
Scroll Sepolia: Operational (100.0% uptime)
Linea Mainnet: Operational (100.0% uptime)
Linea Sepolia: Operational (100.0% uptime)
Blast Mainnet: Operational (100.0% uptime)
Blast Testnet: Operational (100.0% uptime)
Sei Testnet: Operational (100.0% uptime)
X Layer Mainnet: Operational (100.0% uptime)
X Layer Sepolia: Operational (100.0% uptime)
Theta Testnet: Major Outage (0.62% uptime)
Astar zkEVM Mainnet: Operational (100.0% uptime)
Astar zkEVM Sepolia: Operational (100.0% uptime)
Etherlink Testnet: Operational (100.0% uptime)
Gnosis: Operational (100.0% uptime)
Gnosis Chiado: Operational (100.0% uptime)
Boba: Operational (100.0% uptime)
Mode Mainnet: Operational (100.0% uptime)
Mode Testnet: Operational (100.0% uptime)
NeoX: Operational (100.0% uptime)
NeoX Testnet: Operational (100.0% uptime)
Bitcoin Substreams: Operational (100.0% uptime)
Sei Mainnet: Operational (98.66% uptime)
Upgrade Indexer - Miscellaneous: Partial Outage (99.87% uptime)
Deployments: Partial Outage (99.25% uptime)
IPFS: Operational (100.0% uptime)
Subgraph Logs: Operational (100.0% uptime)
Explorer: Operational (100.0% uptime)
thegraph.com: Operational (100.0% uptime)
The Graph Network Subgraph - Arbitrum: Operational (100.0% uptime)
Firehose Indexing: Operational (100.0% uptime)
Firehose Service: Operational (100.0% uptime)
Past Incidents
Nov 20, 2024

No incidents reported today.

Nov 19, 2024
Resolved - This incident has been resolved.
Nov 19, 04:18 UTC
Investigating - [FIRING:1] Etherlink-Sepolia: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Etherlink-Sepolia: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Etherlink-Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivi6c5uyod/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DEtherlink-Sepolia%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=142

Nov 19, 03:33 UTC
Nov 18, 2024

Unresolved incident: Potential for subgraphs to be "Unallocated" from the Upgrade Indexer.

Nov 17, 2024

No incidents reported.

Nov 16, 2024

No incidents reported.

Nov 15, 2024
Resolved - This incident has been resolved.
Nov 15, 16:51 UTC
Investigating - [FIRING:1] Arbitrum-Sepolia: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=8, B1=16
Labels:
- alertname = Arbitrum-Sepolia: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Arbitrum-Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivgjdk54wd/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DArbitrum-Sepolia%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=139

Nov 15, 16:11 UTC
Nov 14, 2024
Resolved - This incident has been resolved.
Nov 14, 07:47 UTC
Investigating - [FIRING:1] Arweave-Mainnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Arweave-Mainnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Arweave-Mainnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/bdjeividov3lsc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DArweave-Mainnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=108

Nov 14, 03:57 UTC
Resolved - This incident has been resolved.
Nov 14, 02:17 UTC
Investigating - [FIRING:1] Arweave-Mainnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Arweave-Mainnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Arweave-Mainnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/bdjeividov3lsc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DArweave-Mainnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=108

Nov 14, 00:02 UTC
Nov 13, 2024
Resolved - This incident has been resolved.
Nov 13, 23:02 UTC
Investigating - [FIRING:1] Arweave-Mainnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Arweave-Mainnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Arweave-Mainnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/bdjeividov3lsc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DArweave-Mainnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=108

Nov 13, 21:02 UTC
Nov 12, 2024

No incidents reported.

Nov 11, 2024

No incidents reported.

Nov 10, 2024

No incidents reported.

Nov 9, 2024

No incidents reported.

Nov 8, 2024

No incidents reported.

Nov 7, 2024

No incidents reported.

Nov 6, 2024
Resolved - This incident has been resolved.
Nov 6, 20:31 UTC
Update - [Comment from Opsgenie] Theodore Butler acknowledged alert: "[Grafana]: Backstop Indexers: Too many indexer errors
Summary:
Backstop indexer has too many errors - IE009

Backstop Indexers:"

Nov 6, 18:08 UTC
Investigating - [FIRING:2] Backstop Indexers: Too many indexer errors Network (mainnet-indexer-02-europe-west indexer P2-High true Network Engineering)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: A=832, B=832, C=1
Labels:
- alertname = Backstop Indexers: Too many indexer errors
- cluster = mainnet-indexer-02-europe-west
- code = IE009
- component = indexer
- grafana_folder = Network
- priority = P2-High
- statuspage = true
- team = Network Engineering
Annotations:
- Opsgenie =
Backstop indexer has too many errors - IE009

- summary =
Backstop indexer mainnet-indexer-02-europe-west has too many errors - IE009

Source: https://thegraph.grafana.net/alerting/grafana/d94de1c9-f48b-4cab-99de-73ee32c75f12/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DBackstop+Indexers%3A+Too+many+indexer+errors&matcher=cluster%3Dmainnet-indexer-02-europe-west&matcher=code%3DIE009&matcher=component%3Dindexer&matcher=grafana_folder%3DNetwork&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DNetwork+Engineering&orgId=1
Dashboard: https://thegraph.grafana.net/d/nIZKCFx7z?orgId=1
Panel: https://thegraph.grafana.net/d/nIZKCFx7z?orgId=1&viewPanel=68

Value: A=8248, B=8248, C=1
Labels:
- alertname = Backstop Indexers: Too many indexer errors
- cluster = mainnet-indexer-02-europe-west
- code = IE010
- component = indexer
- grafana_folder = Network
- priority = P2-High
- statuspage = true
- team = Network Engineering
Annotations:
- Opsgenie =
Backstop indexer has too many errors - IE010

- summary =
Backstop indexer mainnet-indexer-02-europe-west has too many errors - IE010

Source: https://thegraph.grafana.net/alerting/grafana/d94de1c9-f48b-4cab-99de-73ee32c75f12/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DBackstop+Indexers%3A+Too+many+indexer+errors&matcher=cluster%3Dmainnet-indexer-02-europe-west&matcher=code%3DIE010&matcher=component%3Dindexer&matcher=grafana_folder%3DNetwork&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DNetwork+Engineering&orgId=1
Dashboard: https://thegraph.grafana.net/d/nIZKCFx7z?orgId=1
Panel: https://thegraph.grafana.net/d/nIZKCFx7z?orgId=1&viewPanel=68

Nov 6, 17:04 UTC
Resolved - This incident has been resolved.
Nov 6, 18:23 UTC
Investigating - [FIRING:1] Polygon (Matic): Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Polygon (Matic): Block ingestor lagging behind
- cmp_Polygon (Matic) = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- statuspage = Polygon (Matic): Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ZdG_62MVk/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DPolygon+%28Matic%29%3A+Block+ingestor+lagging+behind&matcher=cmp_Polygon+%28Matic%29%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=61

Nov 6, 17:28 UTC
Resolved - This incident has been resolved.
Nov 6, 12:23 UTC
Investigating - [FIRING:1] Polygon (Matic): Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0, B1=0, B2=0
Labels:
- alertname = Polygon (Matic): Block ingestor lagging behind
- cmp_Polygon (Matic) = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- statuspage = Polygon (Matic): Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ZdG_62MVk/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DPolygon+%28Matic%29%3A+Block+ingestor+lagging+behind&matcher=cmp_Polygon+%28Matic%29%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=61

Nov 6, 11:53 UTC
Resolved - This incident has been resolved.
Nov 6, 10:25 UTC
Investigating - [FIRING:1] Celo-Alfajores: Block ingestor lagging behind Production (indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Celo-Alfajores: Block ingestor lagging behind
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Celo-Alfajores: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/cdjeivfxsxpmof/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DCelo-Alfajores%3A+Block+ingestor+lagging+behind&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=76

Nov 5, 17:50 UTC