The Graph
Investigating - [FIRING:1] Osmosis-1: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Osmosis-1: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Osmosis-1: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivf1h90jka/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DOsmosis-1%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=109
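Each firing notification links to a pre-filled Silence form whose matchers mirror the alert's labels, so an equivalent silence can also be created against Grafana's built-in Alertmanager API. The sketch below is illustrative only: the API path, token handling, and two-hour duration are assumptions about a typical Grafana Cloud setup, not something documented on this page.

```python
import datetime as dt
import os
import requests

GRAFANA_URL = "https://thegraph.grafana.net"  # base URL from the alert links above
TOKEN = os.environ["GRAFANA_TOKEN"]           # assumed service-account token

def silence_alert(alertname: str, hours: float = 2.0, comment: str = "ack") -> str:
    """Create a silence matching the alert's labels via the Alertmanager-compatible API.

    The /api/alertmanager/grafana/... path targets Grafana's built-in Alertmanager;
    adjust it if alerts are routed to an external Alertmanager instead.
    """
    now = dt.datetime.now(dt.timezone.utc)
    body = {
        "matchers": [
            {"name": "alertname", "value": alertname, "isRegex": False},
            {"name": "grafana_folder", "value": "Production", "isRegex": False},
        ],
        "startsAt": now.isoformat(),
        "endsAt": (now + dt.timedelta(hours=hours)).isoformat(),
        "createdBy": "tech-support",
        "comment": comment,
    }
    resp = requests.post(
        f"{GRAFANA_URL}/api/alertmanager/grafana/api/v2/silences",
        json=body,
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["silenceID"]

# Example: silence the firing alert above while the ingestor catches up.
# silence_alert("Osmosis-1: Block ingestor lagging behind")
```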

Oct 20, 2024 - 01:48 UTC
Identified - The issue has been identified and a fix is being implemented.
Aug 30, 2024 - 20:47 UTC
Update - We are continuing to work on a fix for this issue.
Aug 23, 2024 - 12:12 UTC
Update - Cosmos indexing has stopped after a recent hard fork. We are working to resolve it as soon as possible.
Aug 23, 2024 - 12:10 UTC
Identified - The issue has been identified and a fix is being implemented.
Aug 13, 2024 - 16:13 UTC

About This Site

Status, incident and maintenance information for The Graph.
Use the RSS feed together with https://zapier.com/apps/rss to have updates sent straight to Slack or Discord.
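As a concrete illustration of that RSS route (without Zapier), a small poller can forward new feed entries to a Slack incoming webhook. The feed and webhook URLs below are placeholders, and using `feedparser` with a plain webhook POST is an assumption, not something this page prescribes.

```python
import json
import time
import urllib.request

import feedparser  # pip install feedparser

FEED_URL = "https://<status-page-host>/history.rss"         # placeholder: the RSS link shown on the status page
SLACK_WEBHOOK = "https://hooks.slack.com/services/XXX/YYY"  # placeholder incoming-webhook URL

def poll_once(seen: set[str]) -> None:
    """Fetch the feed and post any entries not seen before to Slack."""
    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries:
        entry_id = entry.get("id", entry.link)
        if entry_id in seen:
            continue
        seen.add(entry_id)
        message = {"text": f"{entry.title}\n{entry.link}"}
        req = urllib.request.Request(
            SLACK_WEBHOOK,
            data=json.dumps(message).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)

if __name__ == "__main__":
    seen_ids: set[str] = set()  # note: the first pass posts all current entries
    while True:
        poll_once(seen_ids)
        time.sleep(300)  # re-check every five minutes
```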

Component | Status | Uptime (past 90 days)
--- | --- | ---
Network - Gateway | Operational | 100.0 %
Network - Explorer | Operational | 100.0 %
Network - Subgraph Studio | Operational | 99.98 %
Network - Network Subgraph | Operational | 100.0 %
Upgrade Indexer - Queries | Operational | 99.69 %
Upgrade Indexer - Subgraph Health | Degraded Performance | 99.79 %
Upgrade Indexer - Indexing | Major Outage | 97.43 %
Mainnet | Operational | 100.0 %
Sepolia | Operational | 100.0 %
Holesky | Operational | 100.0 %
Polygon (Matic) | Operational | 99.8 %
Polygon Amoy | Operational | 100.0 %
Polygon zkEvm | Operational | 100.0 %
Polygon zkEvm Testnet | Operational | 100.0 %
BSC | Operational | 99.97 %
BSC Chapel | Operational | 100.0 %
Avalanche | Operational | 100.0 %
Avalanche Fuji | Operational | 99.97 %
xDai | Operational | 100.0 %
Fantom | Operational | 100.0 %
Fantom Testnet | Operational | 100.0 %
Celo | Operational | 100.0 %
Celo Alfajores | Operational | 100.0 %
Fuse | Operational | 100.0 %
Moonbeam | Operational | 100.0 %
Moonriver | Operational | 99.63 %
Arbitrum One | Operational | 99.99 %
Arbitrum Sepolia | Operational | 100.0 %
Optimism | Operational | 100.0 %
Optimism Sepolia | Operational | 100.0 %
Aurora | Operational | 100.0 %
Aurora Testnet | Operational | 100.0 %
Arweave | Operational | 100.0 %
Cosmos | Major Outage | 22.9 %
Osmosis | Operational | 100.0 %
zkSync Era | Operational | 100.0 %
Zksync Era Sepolia | Operational | 100.0 %
Base | Operational | 100.0 %
Base Sepolia | Operational | 100.0 %
NEAR | Operational | 100.0 %
Near Testnet | Operational | 100.0 %
Harmony | Operational | 100.0 %
Scroll Mainnet | Operational | 100.0 %
Scroll Sepolia | Operational | 100.0 %
Linea Mainnet | Operational | 100.0 %
Linea Sepolia | Operational | 100.0 %
Blast Mainnet | Operational | 100.0 %
Blast Testnet | Operational | 100.0 %
Sei Testnet | Operational | 100.0 %
X Layer Mainnet | Operational | 100.0 %
X Layer Sepolia | Operational | 100.0 %
Theta Testnet | Major Outage | 33.82 %
Astar ZkEVM Mainnet | Operational | 100.0 %
Astar ZkEVM Sepolia | Operational | 100.0 %
Etherlink Testnet | Operational | 100.0 %
Gnosis | Operational | 100.0 %
Gnosis Chiado | Operational | 100.0 %
Boba | Operational | 100.0 %
Mode Mainnet | Operational | 100.0 %
Mode Testnet | Operational | 100.0 %
NeoX | Operational | 100.0 %
NeoX Testnet | Operational | 100.0 %
Bitcoin Substreams | Operational | 100.0 %
Sei Mainnet | Operational | 97.59 %
Upgrade Indexer - Miscellaneous | Operational | 99.96 %
Deployments | Operational | 99.8 %
IPFS | Operational | 100.0 %
Subgraph Logs | Operational | 100.0 %
Explorer | Operational | 100.0 %
thegraph.com | Operational | 100.0 %
The Graph Network Subgraph - Arbitrum | Operational | 100.0 %
Firehose Indexing | Operational | 100.0 %
Firehose Service | Operational | 100.0 %
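The uptime percentages above translate directly into absolute downtime over the 90-day window, which is often the easier number to reason about. The helper below is just that arithmetic, applied to two figures from the table.

```python
def downtime_hours(uptime_pct: float, window_days: int = 90) -> float:
    """Convert a 90-day uptime percentage into hours of downtime."""
    return (1 - uptime_pct / 100) * window_days * 24

# Upgrade Indexer - Indexing at 97.43 % -> roughly 55.5 hours down
print(downtime_hours(97.43))   # ~55.5
# Cosmos at 22.9 % -> roughly 1665 hours, i.e. about 69 of the last 90 days
print(downtime_hours(22.9))    # ~1665.4
```
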
Past Incidents
Oct 22, 2024

No incidents reported today.

Oct 21, 2024

No incidents reported.

Oct 20, 2024
Resolved - This incident has been resolved.
Oct 20, 01:39 UTC
Investigating - [FIRING:1] Osmosis-1: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Osmosis-1: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Osmosis-1: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivf1h90jka/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DOsmosis-1%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=109

Oct 18, 01:04 UTC
Oct 19, 2024

No incidents reported.

Oct 18, 2024
Resolved - This incident has been resolved.
Oct 18, 00:03 UTC
Investigating - [FIRING:1] Osmosis-1: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Osmosis-1: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Osmosis-1: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivf1h90jka/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DOsmosis-1%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=109

Oct 17, 12:04 UTC
Oct 17, 2024
Resolved - This incident has been resolved.
Oct 17, 20:02 UTC
Investigating - [FIRING:1] NeoX-Mainnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = NeoX-Mainnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = NeoX-Mainnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edw394pt9ajuoe/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DNeoX-Mainnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=163

Oct 17, 12:02 UTC
Resolved - This incident has been resolved.
Oct 17, 11:00 UTC
Investigating - [FIRING:1] NeoX-Mainnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = NeoX-Mainnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = NeoX-Mainnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edw394pt9ajuoe/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DNeoX-Mainnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=163

Oct 3, 16:07 UTC
Resolved - This incident has been resolved.
Oct 17, 11:00 UTC
Investigating - [FIRING:1] Osmosis-1: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Osmosis-1: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Osmosis-1: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivf1h90jka/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DOsmosis-1%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=109

Oct 11, 19:38 UTC
Oct 16, 2024
Resolved - This incident has been resolved.
Oct 16, 19:13 UTC
Investigating - [FIRING:1] Polygon (Matic): Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Polygon (Matic): Block ingestor lagging behind
- cmp_Polygon (Matic) = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- statuspage = Polygon (Matic): Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ZdG_62MVk/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DPolygon+%28Matic%29%3A+Block+ingestor+lagging+behind&matcher=cmp_Polygon+%28Matic%29%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=61

Oct 16, 19:08 UTC
Oct 15, 2024
Resolved - This incident has been resolved.
Oct 15, 14:33 UTC
Investigating - [FIRING:1] Polygon (Matic): Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0, B1=32.35267981116357
Labels:
- alertname = Polygon (Matic): Block ingestor lagging behind
- cmp_Polygon (Matic) = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- statuspage = Polygon (Matic): Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ZdG_62MVk/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DPolygon+%28Matic%29%3A+Block+ingestor+lagging+behind&matcher=cmp_Polygon+%28Matic%29%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=61

Oct 15, 13:38 UTC
Oct 14, 2024

No incidents reported.

Oct 13, 2024

No incidents reported.

Oct 12, 2024
Resolved - This incident has been resolved.
Oct 12, 09:07 UTC
Investigating - [FIRING:1] Etherlink-Sepolia: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Etherlink-Sepolia: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Etherlink-Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivi6c5uyod/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DEtherlink-Sepolia%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=142

Oct 11, 22:18 UTC
Oct 11, 2024
Resolved - This incident has been resolved.
Oct 11, 20:58 UTC
Investigating - [FIRING:1] Polygon (Matic): Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=2.357678422660372, B1=0
Labels:
- alertname = Polygon (Matic): Block ingestor lagging behind
- cmp_Polygon (Matic) = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- statuspage = Polygon (Matic): Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ZdG_62MVk/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DPolygon+%28Matic%29%3A+Block+ingestor+lagging+behind&matcher=cmp_Polygon+%28Matic%29%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=61

Oct 11, 19:28 UTC
Resolved - This incident has been resolved.
Oct 11, 19:23 UTC
Investigating - [FIRING:1] Osmosis-1: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Osmosis-1: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Osmosis-1: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivf1h90jka/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DOsmosis-1%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=109

Oct 3, 16:09 UTC
Resolved - This incident has been resolved.
Oct 11, 19:06 UTC
Investigating - [FIRING:1] Arbitrum-Sepolia: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Arbitrum-Sepolia: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Arbitrum-Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivgjdk54wd/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DArbitrum-Sepolia%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=139

Oct 11, 18:46 UTC
Oct 10, 2024
Resolved - This incident has been resolved.
Oct 10, 22:38 UTC
Investigating - [FIRING:1] Polygon (Matic): Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0, B1=0
Labels:
- alertname = Polygon (Matic): Block ingestor lagging behind
- cmp_Polygon (Matic) = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- statuspage = Polygon (Matic): Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ZdG_62MVk/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DPolygon+%28Matic%29%3A+Block+ingestor+lagging+behind&matcher=cmp_Polygon+%28Matic%29%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=61

Oct 10, 22:18 UTC
Resolved - This incident has been resolved.
Oct 10, 06:21 UTC
Investigating - [FIRING:1] Arbitrum-Sepolia: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Arbitrum-Sepolia: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Arbitrum-Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivgjdk54wd/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DArbitrum-Sepolia%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=139

Oct 10, 06:16 UTC
Oct 9, 2024
Resolved - This incident has been resolved.
Oct 9, 20:08 UTC
Investigating - [FIRING:1] Polygon (Matic): Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=15.495695640099973
Labels:
- alertname = Polygon (Matic): Block ingestor lagging behind
- cmp_Polygon (Matic) = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- statuspage = Polygon (Matic): Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ZdG_62MVk/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DPolygon+%28Matic%29%3A+Block+ingestor+lagging+behind&matcher=cmp_Polygon+%28Matic%29%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=61

Oct 9, 19:13 UTC
Oct 8, 2024
Resolved - This incident has been resolved.
Oct 8, 22:22 UTC
Identified - The Moonriver tracing nodes are down after the RT3100 enactment this morning; the Moonriver team is investigating.
Oct 8, 14:27 UTC
Resolved - This incident has been resolved.
Oct 8, 16:39 UTC
Investigating - [FIRING:1] Moonriver: Block ingestor lagging behind Production (partial_outage indexers P2-High true Tech Support)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Moonriver: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Tech Support
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Moonriver: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/cdjeivhwz1kaoa/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DMoonriver%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DTech+Support&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=92

Oct 8, 15:19 UTC