The Graph
Monitoring - A fix has been implemented and we are monitoring the results.
If we don't see any issues within 24 hours, we will close this alert.

Jul 26, 2024 - 09:14 UTC
Investigating - We are currently investigating this issue.
Jul 25, 2024 - 21:02 UTC
Identified - The issue has been identified and a fix is being implemented.
Jul 24, 2024 - 16:22 UTC

About This Site

Status, incident, and maintenance information for The Graph.
Use the RSS feed together with https://zapier.com/apps/rss to have updates sent straight to Slack or Discord.
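If you would rather script this yourself than go through Zapier, the sketch below polls the feed and forwards new entries to a Slack incoming webhook. It is a minimal example under stated assumptions: the feed is assumed to live at the standard Statuspage location (https://status.thegraph.com/history.rss), and the webhook URL is a placeholder for one you create in your own workspace.

```python
# Minimal sketch: poll the status RSS feed and forward new items to a Slack
# incoming webhook. Both URLs below are assumptions/placeholders, not values
# published by The Graph.
import json
import time
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://status.thegraph.com/history.rss"    # assumed Statuspage feed location
SLACK_WEBHOOK = "https://hooks.slack.com/services/XXX"  # your own incoming webhook

def fetch_items(url):
    # Parse RSS 2.0: each channel/item carries title, link and pubDate.
    with urllib.request.urlopen(url, timeout=30) as resp:
        root = ET.fromstring(resp.read())
    for item in root.iter("item"):
        yield {
            "title": item.findtext("title", ""),
            "link": item.findtext("link", ""),
            "pubDate": item.findtext("pubDate", ""),
        }

def post_to_slack(text):
    req = urllib.request.Request(
        SLACK_WEBHOOK,
        data=json.dumps({"text": text}).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=30)

def main(poll_seconds=300):
    seen = set()
    while True:
        # Note: the first pass posts the feed's current history once.
        for item in fetch_items(FEED_URL):
            key = (item["title"], item["pubDate"])
            if key not in seen:
                seen.add(key)
                post_to_slack(f"{item['title']} ({item['pubDate']})\n{item['link']}")
        time.sleep(poll_seconds)

if __name__ == "__main__":
    main()
```

Run it from a cron job or a systemd timer and it keeps a channel up to date without Zapier.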

Component status and uptime over the past 90 days:

Network - Gateway: Degraded Performance (99.83% uptime)
Network - Explorer: Operational (99.83% uptime)
Network - Subgraph Studio: Operational (99.7% uptime)
Network - Network Subgraph: Operational (99.83% uptime)
Upgrade Indexer - Queries: Operational (99.7% uptime)
Upgrade Indexer - Subgraph Health: Operational (99.83% uptime)
Upgrade Indexer - Indexing: Operational (96.68% uptime)
Mainnet: Operational (99.23% uptime)
Sepolia: Operational (99.23% uptime)
Holesky: Operational (99.23% uptime)
Polygon (Matic): Operational (99.92% uptime)
Polygon Amoy: Operational (100.0% uptime)
Polygon zkEvm: Operational (98.56% uptime)
Polygon zkEvm Testnet: Operational (99.33% uptime)
BSC: Operational (99.23% uptime)
BSC Chapel: Operational (100.0% uptime)
Avalanche: Operational (100.0% uptime)
Avalanche Fuji: Operational (100.0% uptime)
xDai: Operational (100.0% uptime)
Fantom: Operational (100.0% uptime)
Fantom Testnet: Operational (100.0% uptime)
Celo: Operational (99.23% uptime)
Celo Alfajores: Operational (100.0% uptime)
Fuse: Operational (100.0% uptime)
Moonbeam: Operational (100.0% uptime)
Moonriver: Operational (100.0% uptime)
Arbitrum One: Operational (99.23% uptime)
Arbitrum Sepolia: Operational (100.0% uptime)
Optimism: Operational (100.0% uptime)
Optimism Sepolia: Operational (100.0% uptime)
Aurora: Operational (100.0% uptime)
Aurora Testnet: Operational (100.0% uptime)
Arweave: Operational (99.23% uptime)
Cosmos: Operational (78.94% uptime)
Osmosis: Operational (86.78% uptime)
zkSync Era: Operational (100.0% uptime)
Zksync Era Sepolia: Operational (100.0% uptime)
Base: Operational (57.35% uptime)
Base Sepolia: Operational (10.0% uptime)
NEAR: Operational (99.23% uptime)
Near Testnet: Operational (99.23% uptime)
Harmony: Operational (100.0% uptime)
Scroll Mainnet: Operational (100.0% uptime)
Scroll Sepolia: Operational (100.0% uptime)
Linea Mainnet: Operational (100.0% uptime)
Linea Sepolia: Operational (100.0% uptime)
Blast Mainnet: Operational (100.0% uptime)
Blast Testnet: Operational (99.46% uptime)
Sei Testnet: Operational (100.0% uptime)
X Layer Mainnet: Operational (100.0% uptime)
X Layer Sepolia: Operational (100.0% uptime)
Theta Testnet: Operational (99.23% uptime)
Astar ZkEVM Mainnet: Operational (100.0% uptime)
Astar ZkEVM Sepolia: Operational (100.0% uptime)
Etherlink Testnet: Operational (100.0% uptime)
Gnosis: Operational (100.0% uptime)
Gnosis Chiado: Operational (100.0% uptime)
Boba: Operational (99.23% uptime)
Mode Mainnet: Operational (100.0% uptime)
Mode Testnet: Operational (100.0% uptime)
Bitcoin Substreams: Operational (99.23% uptime)
Upgrade Indexer - Miscellaneous: Operational (99.43% uptime)
Deployments: Operational (96.6% uptime)
IPFS: Operational (100.0% uptime)
Subgraph Logs: Operational (100.0% uptime)
Explorer: Operational (100.0% uptime)
thegraph.com: Operational (100.0% uptime)
The Graph Network Subgraph - Arbitrum: Operational (100.0% uptime)
Firehose Indexing: Degraded Performance (99.83% uptime)
Firehose Service: Degraded Performance (99.83% uptime)
Past Incidents
Jul 27, 2024

No incidents reported today.

Jul 26, 2024
Resolved - This incident has been resolved.
Jul 26, 10:55 UTC
Update - [Comment from Opsgenie]muhammad acknowledged alert: "[Grafana]: Hosted Service - Queries: too many failed requests"
Jul 26, 08:19 UTC
Investigating - [FIRING:1] Hosted Service - Queries: too many failed requests Production (P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: C0=0.617470983506
Labels:
- alertname = Hosted Service - Queries: too many failed requests
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Hosted Service - Queries: too many failed requests
Source: https://thegraph.grafana.net/alerting/grafana/9rM_ehMVz/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DHosted+Service+-+Queries%3A+too+many+failed+requests&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=89

Jul 26, 08:18 UTC
Resolved - This incident has been resolved.
Jul 26, 10:54 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Sei-Testnet: Block ingestor lagging behind"
Jul 26, 09:39 UTC
Investigating - [FIRING:1] Sei-Testnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Sei-Testnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Sei-Testnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/cdjeivf67241sc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DSei-Testnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 26, 09:39 UTC
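Many of the incidents on this page are automated "Block ingestor lagging behind" alerts like the one above. As a rough illustration of what that kind of check measures, the sketch below asks a graph-node index-node status endpoint how far each tracked chain's latest processed block trails the chain head and flags gaps above a threshold. The endpoint address, GraphQL fields, and threshold are illustrative assumptions, not the production Grafana alert rule.

```python
# Rough sketch of a "block ingestor lagging behind" style check, assuming access
# to a graph-node index-node status endpoint (commonly exposed on port 8030).
# Endpoint, fields, and threshold are illustrative assumptions only.
import json
import urllib.request

STATUS_URL = "http://localhost:8030/graphql"  # assumed index-node status endpoint
LAG_THRESHOLD = 50                            # blocks; arbitrary example value

QUERY = """
{
  indexingStatuses {
    subgraph
    chains {
      network
      chainHeadBlock { number }
      latestBlock { number }
    }
  }
}
"""

def fetch_statuses():
    req = urllib.request.Request(
        STATUS_URL,
        data=json.dumps({"query": QUERY}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["data"]["indexingStatuses"]

def lagging(statuses, threshold=LAG_THRESHOLD):
    """Yield (subgraph, network, lag) tuples where indexing trails the chain head."""
    for status in statuses:
        for chain in status["chains"]:
            head = chain.get("chainHeadBlock") or {}
            latest = chain.get("latestBlock") or {}
            if head.get("number") is None or latest.get("number") is None:
                continue
            lag = int(head["number"]) - int(latest["number"])
            if lag > threshold:
                yield status["subgraph"], chain["network"], lag

if __name__ == "__main__":
    for subgraph, network, lag in lagging(fetch_statuses()):
        print(f"{network}: {subgraph} is {lag} blocks behind the chain head")
```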
Resolved - This incident has been resolved.
Jul 26, 10:54 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Theta-Testnet-001: Block ingestor lagging behind"
Jul 26, 10:52 UTC
Investigating - [FIRING:1] Theta-Testnet-001: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Theta-Testnet-001: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Theta-Testnet-001: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivhuenbi8d/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTheta-Testnet-001%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 26, 10:52 UTC
Resolved - This incident has been resolved.
Jul 26, 10:21 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Theta-Testnet-001: Block ingestor lagging behind"
Jul 26, 10:18 UTC
Investigating - [FIRING:1] Theta-Testnet-001: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Theta-Testnet-001: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Theta-Testnet-001: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivhuenbi8d/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTheta-Testnet-001%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 26, 10:18 UTC
Resolved - This incident has been resolved.
Jul 26, 09:48 UTC
Investigating - [FIRING:1] Theta-Testnet-001: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Theta-Testnet-001: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Theta-Testnet-001: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivhuenbi8d/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTheta-Testnet-001%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 26, 09:43 UTC
Resolved - This incident has been resolved.
Jul 26, 09:08 UTC
Update - [Comment from Opsgenie]Yash Jagtap acknowledged alert: "[Grafana]: Theta-Testnet-001: Block ingestor lagging behind"
Jul 26, 09:08 UTC
Investigating - [FIRING:1] Theta-Testnet-001: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Theta-Testnet-001: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Theta-Testnet-001: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivhuenbi8d/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTheta-Testnet-001%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 26, 09:08 UTC
Resolved - This incident has been resolved.
Jul 26, 09:07 UTC
Investigating - [FIRING:1] Sei-Testnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Sei-Testnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Sei-Testnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/cdjeivf67241sc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DSei-Testnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 26, 09:04 UTC
Resolved - This incident has been resolved.
Jul 26, 08:07 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Hosted Service - Queries: too many failed requests"
Jul 26, 00:19 UTC
Investigating - [FIRING:1] Hosted Service - Queries: too many failed requests Production (P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: C0=0.499553571429
Labels:
- alertname = Hosted Service - Queries: too many failed requests
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Hosted Service - Queries: too many failed requests
Source: https://thegraph.grafana.net/alerting/grafana/9rM_ehMVz/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DHosted+Service+-+Queries%3A+too+many+failed+requests&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=89

Jul 26, 00:18 UTC
Resolved - This incident has been resolved.
Jul 26, 08:07 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Theta-Testnet-001: Block ingestor lagging behind"
Jul 26, 01:08 UTC
Investigating - [FIRING:1] Theta-Testnet-001: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Theta-Testnet-001: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Theta-Testnet-001: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivhuenbi8d/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTheta-Testnet-001%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 26, 01:08 UTC
Resolved - This incident has been resolved.
Jul 26, 08:04 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Sei-Testnet: Block ingestor lagging behind"
Jul 26, 01:12 UTC
Investigating - [FIRING:1] Sei-Testnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Sei-Testnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Sei-Testnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/cdjeivf67241sc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DSei-Testnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 26, 01:09 UTC
Resolved - This incident has been resolved.
Jul 26, 00:06 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Hosted Service - Queries: too many failed requests"
Jul 25, 20:18 UTC
Investigating - [FIRING:1] Hosted Service - Queries: too many failed requests Production (P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: C0=0.59687756779
Labels:
- alertname = Hosted Service - Queries: too many failed requests
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Hosted Service - Queries: too many failed requests
Source: https://thegraph.grafana.net/alerting/grafana/9rM_ehMVz/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DHosted+Service+-+Queries%3A+too+many+failed+requests&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=89

Jul 25, 20:18 UTC
Resolved - This incident has been resolved.
Jul 26, 00:06 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Sei-Testnet: Block ingestor lagging behind"
Jul 25, 17:04 UTC
Investigating - [FIRING:1] Sei-Testnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Sei-Testnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Sei-Testnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/cdjeivf67241sc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DSei-Testnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 25, 17:04 UTC
Resolved - This incident has been resolved.
Jul 26, 00:06 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Theta-Testnet-001: Block ingestor lagging behind"
Jul 25, 17:43 UTC
Investigating - [FIRING:1] Theta-Testnet-001: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Theta-Testnet-001: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Theta-Testnet-001: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivhuenbi8d/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTheta-Testnet-001%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 25, 17:43 UTC
Jul 25, 2024
Resolved - This incident has been resolved.
Jul 25, 20:07 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Hosted Service - Queries: too many failed requests"
Jul 25, 17:47 UTC
Investigating - [FIRING:1] Hosted Service - Queries: too many failed requests Production (P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: C0=0.767052883098
Labels:
- alertname = Hosted Service - Queries: too many failed requests
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Hosted Service - Queries: too many failed requests
Source: https://thegraph.grafana.net/alerting/grafana/9rM_ehMVz/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DHosted+Service+-+Queries%3A+too+many+failed+requests&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=89

Jul 25, 17:43 UTC
Resolved - This incident has been resolved.
Jul 25, 17:19 UTC
Update - [Comment from Opsgenie] muhammad acknowledged alert: "[Grafana]: DatasourceNoData

36 resolved alert(s):

DatasourceNoData
DatasourceNoData
DatasourceNoData
DatasourceNoData
D"

Jul 25, 17:14 UTC
Investigating - [FIRING:1, RESOLVED:36] DatasourceNoData Production (indexers c448b23e-d637-4e71-a420-d83567efd033 P2-High A true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Fuji: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Fuji: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivh1tt534a/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DFuji%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1


**Resolved**

Value: [no value]
Labels:
- alertname = DatasourceNoData
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Celo-Alfajores: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Celo-Alfajores: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/cdjeivfxsxpmof/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DCelo-Alfajores%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp_Polygon (Matic) = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Polygon (Matic): Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Polygon (Matic): Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ZdG_62MVk/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp_Polygon+%28Matic%29%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DPolygon+%28Matic%29%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=61

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = xDai: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = xDai: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivf32oohsc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DxDai%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Optimism-Sepolia: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Optimism-Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivf7zzh1ce/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DOptimism-Sepolia%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Scroll-Sepolia: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Scroll-Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivgjg21hce/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DScroll-Sepolia%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Moonbase-Alpha: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Moonbase-Alpha: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivhafgjy8c/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DMoonbase-Alpha%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Linea: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Linea: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivhrhrkzkb/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DLinea%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Mode-Mainnet: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Mode-Mainnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivht871moc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DMode-Mainnet%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = NEAR-Testnet: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = NEAR-Testnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivijl4gzkb/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DNEAR-Testnet%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Optimism: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Optimism: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/bdjeivf622bcwb/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DOptimism%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Harmony: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Harmony: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/bdjeivgdwa5tsc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DHarmony%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = BSC: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = BSC: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/bdjeivgiykr28f/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DBSC%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Arweave-Mainnet: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Arweave-Mainnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/bdjeividov3lsc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DArweave-Mainnet%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Base-Sepolia: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Base-Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/cdjeivf5alfk0a/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&m

Jul 25, 17:14 UTC
Resolved - This incident has been resolved.
Jul 25, 17:13 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Hosted Service - Queries: too many failed requests"
Jul 25, 17:08 UTC
Investigating - [FIRING:1] Hosted Service - Queries: too many failed requests Production (P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: C0=0.633387613005, C1=0.563021278503
Labels:
- alertname = Hosted Service - Queries: too many failed requests
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Hosted Service - Queries: too many failed requests
Source: https://thegraph.grafana.net/alerting/grafana/9rM_ehMVz/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DHosted+Service+-+Queries%3A+too+many+failed+requests&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=89

Jul 25, 17:08 UTC
Resolved - This incident has been resolved.
Jul 25, 17:12 UTC
Update - [Comment from Opsgenie] muhammad acknowledged alert: "[Grafana]: DatasourceNoData

DatasourceNoData
DatasourceNoData
DatasourceNoData"

Jul 25, 15:55 UTC
Investigating - [FIRING:4] DatasourceNoData Production (partial_outage indexers c448b23e-d637-4e71-a420-d83567efd033 P2-High A true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Moonbase-Alpha: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Moonbase-Alpha: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivhafgjy8c/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DMoonbase-Alpha%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Scroll: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Scroll: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/cdjeivh6w3qbkc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DScroll%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Matic: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Matic: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivf84z9q8e/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DMatic%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Cosmos-Hub-4: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Cosmos-Hub-4: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivh5d5ypsc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DCosmos-Hub-4%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 25, 15:54 UTC
Resolved - This incident has been resolved.
Jul 25, 17:12 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Theta-Testnet-001: Block ingestor lagging behind"
Jul 25, 17:08 UTC
Investigating - [FIRING:1] Theta-Testnet-001: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Theta-Testnet-001: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Theta-Testnet-001: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivhuenbi8d/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTheta-Testnet-001%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 25, 17:08 UTC
Resolved - This incident has been resolved.
Jul 25, 16:55 UTC
Investigating - We are currently investigating this issue.
Jul 25, 16:29 UTC
Resolved - This incident has been resolved.
Jul 25, 16:28 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Hosted Service - Queries: too many failed requests"
Jul 25, 16:18 UTC
Investigating - [FIRING:1] Hosted Service - Queries: too many failed requests Production (P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: C0=0.865856753156
Labels:
- alertname = Hosted Service - Queries: too many failed requests
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Hosted Service - Queries: too many failed requests
Source: https://thegraph.grafana.net/alerting/grafana/9rM_ehMVz/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DHosted+Service+-+Queries%3A+too+many+failed+requests&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=89

Jul 25, 16:18 UTC
Resolved - This incident has been resolved.
Jul 25, 16:02 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Hosted Service - Queries: too many failed requests"
Jul 24, 16:28 UTC
Investigating - [FIRING:1] Hosted Service - Queries: too many failed requests Production (P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: C0=0.491537376587
Labels:
- alertname = Hosted Service - Queries: too many failed requests
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Hosted Service - Queries: too many failed requests
Source: https://thegraph.grafana.net/alerting/grafana/9rM_ehMVz/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DHosted+Service+-+Queries%3A+too+many+failed+requests&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=89

Jul 24, 16:27 UTC
Resolved - This incident has been resolved.
Jul 25, 16:02 UTC
Update - [Comment from Opsgenie]muhammad acknowledged alert: "[Grafana]: Theta-Testnet-001: Block ingestor lagging behind"
Jul 24, 16:58 UTC
Investigating - [FIRING:1] Theta-Testnet-001: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Theta-Testnet-001: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Theta-Testnet-001: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivhuenbi8d/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTheta-Testnet-001%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 24, 16:57 UTC
Jul 24, 2024
Resolved - This incident has been resolved.
Jul 24, 18:03 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Polygon (Matic): Block ingestor lagging behind"
Jul 24, 17:33 UTC
Investigating - [FIRING:1] Polygon (Matic): Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=28
Labels:
- alertname = Polygon (Matic): Block ingestor lagging behind
- cmp_Polygon (Matic) = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Polygon (Matic): Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ZdG_62MVk/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DPolygon+%28Matic%29%3A+Block+ingestor+lagging+behind&matcher=cmp_Polygon+%28Matic%29%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=61

Jul 24, 17:33 UTC
Resolved - There are no reports of issues, and things look stable. We will close out the alert in 24 hours if things continue to look good.
Jul 24, 16:27 UTC
Monitoring - A fix has been implemented and we are monitoring the results.
Jul 24, 08:35 UTC
Update - We are actively investigating and debugging the "Internal Server Error" responses while updating the IPFS Kubo nodes. Our team is closely monitoring the situation and will provide a further update in the coming hours.
Jul 24, 06:10 UTC
Update - Some subgraph deployments and IPFS requests are going through at this time, although we're still seeing a large volume of "500 internal server errors". The team is working around the clock to continue debugging this. We will provide another update in the coming hours as we continue to make progress.
Jul 24, 01:17 UTC
Update - We first identified a major issue with our IPFS instance on July 18th and began troubleshooting immediately. Our devops team worked over the weekend to determine whether the root cause was related to our IPFS configurations or our internal deployment router.

The team increased timeouts, restarted services, and blocked IPs, but the issue persisted. We have now pulled in IPFS experts to help resolve the issue, and are in the process of reconfiguring our IPFS kubo nodes, improving node peering, and updating our load balancing services. We are monitoring the situation and will provide another update in the coming hours.

Jul 23, 20:17 UTC
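For context on the kind of checks involved in the update above, the sketch below verifies two basic things about a Kubo node: that it has connected peers and that its gateway can serve a given CID without returning a 500. The addresses are Kubo's local defaults and the CID is supplied by the caller; none of these are The Graph's actual infrastructure values.

```python
# Minimal health probe for an IPFS Kubo node, assuming the default local
# RPC API (port 5001) and gateway (port 8080) addresses.
import json
import sys
import urllib.error
import urllib.request

KUBO_RPC = "http://127.0.0.1:5001"  # default Kubo RPC API address (assumption)
GATEWAY = "http://127.0.0.1:8080"   # default Kubo gateway address (assumption)

def peer_count():
    # Kubo's RPC API only accepts POST requests.
    req = urllib.request.Request(f"{KUBO_RPC}/api/v0/swarm/peers", method="POST")
    with urllib.request.urlopen(req, timeout=30) as resp:
        peers = json.load(resp).get("Peers") or []
    return len(peers)

def gateway_ok(cid):
    # Fetch a CID known to be pinned on this node through its gateway.
    try:
        with urllib.request.urlopen(f"{GATEWAY}/ipfs/{cid}", timeout=60) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        print(f"gateway returned HTTP {err.code} for {cid}")
        return False

if __name__ == "__main__":
    print(f"connected peers: {peer_count()}")
    if len(sys.argv) > 1:
        print(f"gateway fetch ok: {gateway_ok(sys.argv[1])}")
```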
Identified - The issue has been identified and a fix is being implemented.
Jul 22, 14:31 UTC
Monitoring - The issue has been identified and a solution is being implemented.
Jul 22, 10:38 UTC
Identified - The issue has been identified and a fix is being implemented.
Jul 18, 10:16 UTC
Resolved - Created a new incident for the Firehose outage.
Jul 24, 16:23 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Theta-Testnet-001: Block ingestor lagging behind"
Jul 24, 15:23 UTC
Investigating - [FIRING:1] Theta-Testnet-001: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Theta-Testnet-001: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Theta-Testnet-001: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivhuenbi8d/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTheta-Testnet-001%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 24, 15:23 UTC
Resolved - This incident has been resolved.
Jul 24, 08:43 UTC
Update - [Comment from Opsgenie]muhammad acknowledged alert: "[Grafana]: Hosted Service - Queries: too many failed requests"
Jul 23, 20:29 UTC
Investigating - [FIRING:1] Hosted Service - Queries: too many failed requests Production (P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: C0=0.232087306849
Labels:
- alertname = Hosted Service - Queries: too many failed requests
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Hosted Service - Queries: too many failed requests
Source: https://thegraph.grafana.net/alerting/grafana/9rM_ehMVz/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DHosted+Service+-+Queries%3A+too+many+failed+requests&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=89

Jul 23, 20:28 UTC
Resolved - This incident has been resolved.
Jul 24, 07:13 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Etherlink-Sepolia: Block ingestor lagging behind"
Jul 24, 05:38 UTC
Investigating - [FIRING:1] Etherlink-Sepolia: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=9
Labels:
- alertname = Etherlink-Sepolia: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Etherlink-Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivi6c5uyod/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DEtherlink-Sepolia%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 24, 05:38 UTC
Jul 23, 2024
Resolved - This incident has been resolved.
Jul 23, 18:52 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Hosted Service - Queries: too many failed requests"
Jul 23, 14:48 UTC
Investigating - [FIRING:1] Hosted Service - Queries: too many failed requests Production (P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: C0=0.1504902401727
Labels:
- alertname = Hosted Service - Queries: too many failed requests
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Hosted Service - Queries: too many failed requests
Source: https://thegraph.grafana.net/alerting/grafana/9rM_ehMVz/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DHosted+Service+-+Queries%3A+too+many+failed+requests&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=89

Jul 23, 14:48 UTC
Resolved - This incident has been resolved.
Jul 23, 04:28 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Osmosis-1: Block ingestor lagging behind"
Jul 23, 01:04 UTC
Investigating - [FIRING:1] Osmosis-1: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Osmosis-1: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Osmosis-1: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivf1h90jka/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DOsmosis-1%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 23, 00:04 UTC
Jul 22, 2024
Resolved - This incident has been resolved.
Jul 22, 23:04 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Osmosis-1: Block ingestor lagging behind"
Jul 21, 03:14 UTC
Investigating - [FIRING:1] Osmosis-1: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Osmosis-1: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Osmosis-1: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivf1h90jka/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DOsmosis-1%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 21, 03:14 UTC
Resolved - This incident has been resolved.
Jul 22, 14:25 UTC
Investigating - [FIRING:1] DatasourceNoData Production (partial_outage indexers c448b23e-d637-4e71-a420-d83567efd033 P2-High A NEAR-Testnet: Block ingestor lagging behind true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = NEAR-Testnet: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = NEAR-Testnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivijl4gzkb/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DNEAR-Testnet%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 22, 13:26 UTC
Resolved - This incident has been resolved.
Jul 22, 14:03 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Polygon (Matic): Block ingestor lagging behind"
Jul 22, 13:33 UTC
Investigating - [FIRING:1] Polygon (Matic): Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=54.20710059171598
Labels:
- alertname = Polygon (Matic): Block ingestor lagging behind
- cmp_Polygon (Matic) = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Polygon (Matic): Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ZdG_62MVk/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DPolygon+%28Matic%29%3A+Block+ingestor+lagging+behind&matcher=cmp_Polygon+%28Matic%29%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=61

Jul 22, 13:23 UTC
Resolved - This incident has been resolved.
Jul 22, 13:35 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: NEAR-Testnet: Block ingestor lagging behind"
Jul 21, 03:21 UTC
Investigating - [FIRING:1] NEAR-Testnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = NEAR-Testnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = NEAR-Testnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivijl4gzkb/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DNEAR-Testnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 21, 03:21 UTC
Resolved - This incident has been resolved.
Jul 22, 12:32 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Theta-Testnet-001: Block ingestor lagging behind"
Jul 21, 03:18 UTC
Investigating - [FIRING:1] Theta-Testnet-001: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Theta-Testnet-001: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Theta-Testnet-001: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivhuenbi8d/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTheta-Testnet-001%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 21, 03:18 UTC
Jul 21, 2024
Resolved - This incident has been resolved.
Jul 21, 08:56 UTC
Identified - You may find that your subgraphs indexing BTC, Arbitrum One, Theta Testnet, Cosmos Hub, Base, Osmosis, Polygon zkEVM, Arweave, NEAR Mainnet, NEAR Testnet, Binance Smart Chain, Holesky, Ethereum Mainnet, or Sepolia are lagging behind. We are working on resolving it.
Jul 20, 16:24 UTC
Investigating - You may find that your subgraphs indexing BTC, Arbitrum One, Theta Testnet, Cosmos Hub, Base, Osmosis, Polygon zkEVM, Arweave, NEAR Mainnet, NEAR Testnet, Binance Smart Chain, Holesky, Ethereum Mainnet, or Sepolia are lagging behind. We are working on resolving it.
Jul 20, 16:24 UTC
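
Note: readers who want to verify lag on a specific subgraph themselves can query graph-node's indexing-status GraphQL API. The sketch below is illustrative only: the endpoint URL and the subgraph name are placeholders, and available fields may differ by deployment; see The Graph's documentation for the status endpoint that applies to you.

```python
# Sketch (assumptions): compare a subgraph's latest indexed block with the
# chain head via graph-node's indexing-status API. Endpoint and subgraph name
# below are placeholders, not the values used by The Graph's own alerting.
import json
import urllib.request

STATUS_ENDPOINT = "https://api.thegraph.com/index-node/graphql"  # assumed endpoint
QUERY = """
{
  indexingStatusForCurrentVersion(subgraphName: "example-org/example-subgraph") {
    synced
    health
    chains {
      network
      chainHeadBlock { number }
      latestBlock { number }
    }
  }
}
"""

req = urllib.request.Request(
    STATUS_ENDPOINT,
    data=json.dumps({"query": QUERY}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    status = json.load(resp)["data"]["indexingStatusForCurrentVersion"]

for chain in status["chains"]:
    if chain["latestBlock"] is None:  # deployment has not started indexing this chain yet
        continue
    lag = int(chain["chainHeadBlock"]["number"]) - int(chain["latestBlock"]["number"])
    print(f'{chain["network"]}: about {lag} blocks behind (health={status["health"]})')
```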
Resolved - This incident has been resolved.
Jul 21, 01:11 UTC
Update - [Comment from Opsgenie]ruslan acknowledged alert: "[Grafana]: Base: Block ingestor lagging behind"
Jul 20, 16:33 UTC
Investigating - [FIRING:1] Base: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Base: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Base: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivil1kc8we/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DBase%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 16:32 UTC
Resolved - This incident has been resolved.
Jul 21, 01:08 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: XLayer-Sepolia: Block ingestor lagging behind"
Jul 21, 00:32 UTC
Investigating - [FIRING:1] XLayer-Sepolia: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = XLayer-Sepolia: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = XLayer-Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/fdjeivgy2ymf4e/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DXLayer-Sepolia%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 21, 00:28 UTC
Resolved - This incident has been resolved.
Jul 21, 01:08 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Etherlink-Sepolia: Block ingestor lagging behind"
Jul 21, 00:33 UTC
Investigating - [FIRING:1] Etherlink-Sepolia: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Etherlink-Sepolia: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Etherlink-Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivi6c5uyod/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DEtherlink-Sepolia%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 21, 00:33 UTC
Resolved - This incident has been resolved.
Jul 21, 01:07 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Boba: Block ingestor lagging behind"
Jul 21, 00:32 UTC
Investigating - [FIRING:1] Boba: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Boba: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Boba: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/fdjeivhz9g7pca/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DBoba%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 21, 00:32 UTC
Resolved - This incident has been resolved.
Jul 21, 01:06 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Arbitrum-Sepolia: Block ingestor lagging behind"
Jul 21, 00:32 UTC
Investigating - [FIRING:1] Arbitrum-Sepolia: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Arbitrum-Sepolia: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Arbitrum-Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivgjdk54wd/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DArbitrum-Sepolia%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 21, 00:31 UTC
Resolved - This incident has been resolved.
Jul 21, 01:04 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Moonriver: Block ingestor lagging behind"
Jul 21, 00:32 UTC
Investigating - [FIRING:1] Moonriver: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Moonriver: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Moonriver: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/cdjeivhwz1kaoa/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DMoonriver%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 21, 00:29 UTC
Resolved - This incident has been resolved.
Jul 21, 01:03 UTC
Update - [Comment from Opsgenie]ruslan acknowledged alert: "[Grafana]: Polygon-zkEVM: Block ingestor lagging behind"
Jul 20, 16:33 UTC
Investigating - [FIRING:1] Polygon-zkEVM: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Polygon-zkEVM: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Polygon-zkEVM: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/fdjeivgbjdm2oe/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DPolygon-zkEVM%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 16:28 UTC
Resolved - This incident has been resolved.
Jul 21, 00:05 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Mainnet: Block ingestor lagging behind"
Jul 20, 21:31 UTC
Investigating - [FIRING:1] Mainnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Mainnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Mainnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/fdjeivfrf71tsb/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DMainnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 21:31 UTC
Jul 20, 2024
Resolved - This incident has been resolved.
Jul 20, 17:56 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: NEAR-Testnet: Block ingestor lagging behind"
Jul 20, 16:59 UTC
Investigating - [FIRING:1] NEAR-Testnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = NEAR-Testnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = NEAR-Testnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/adjeivijl4gzkb/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DNEAR-Testnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 16:56 UTC
Resolved - This incident has been resolved.
Jul 20, 17:54 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Cosmos-Hub-4: Block ingestor lagging behind"
Jul 20, 16:55 UTC
Investigating - [FIRING:1] Cosmos-Hub-4: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Cosmos-Hub-4: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Cosmos-Hub-4: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivh5d5ypsc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DCosmos-Hub-4%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 16:54 UTC
Resolved - This incident has been resolved.
Jul 20, 17:53 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Theta-Testnet-001: Block ingestor lagging behind"
Jul 20, 16:55 UTC
Investigating - [FIRING:1] Theta-Testnet-001: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Theta-Testnet-001: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Theta-Testnet-001: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivhuenbi8d/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTheta-Testnet-001%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 16:53 UTC
Resolved - This incident has been resolved.
Jul 20, 17:52 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Arweave-Mainnet: Block ingestor lagging behind"
Jul 20, 16:55 UTC
Investigating - [FIRING:1] Arweave-Mainnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Arweave-Mainnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Arweave-Mainnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/bdjeividov3lsc/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DArweave-Mainnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 16:52 UTC
Resolved - This incident has been resolved.
Jul 20, 17:51 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Arbitrum-One: Block ingestor lagging behind"
Jul 20, 16:59 UTC
Investigating - [FIRING:1] Arbitrum-One: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Arbitrum-One: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Arbitrum-One: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivf9xwmpsf/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DArbitrum-One%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 16:56 UTC
Resolved - This incident has been resolved.
Jul 20, 17:51 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Celo: Block ingestor lagging behind"
Jul 20, 16:59 UTC
Investigating - [FIRING:1] Celo: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Celo: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Celo: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/cdjeivigqqmtdb/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DCelo%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 16:56 UTC
Resolved - This incident has been resolved.
Jul 20, 17:51 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Mainnet: Block ingestor lagging behind"
Jul 20, 16:51 UTC
Investigating - [FIRING:1] Mainnet: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Mainnet: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Mainnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/fdjeivfrf71tsb/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DMainnet%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 16:51 UTC
Resolved - This incident has been resolved.
Jul 20, 17:50 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Holesky: Block ingestor lagging behind"
Jul 20, 16:55 UTC
Investigating - [FIRING:1] Holesky: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Holesky: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Holesky: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivfpyr6kgb/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DHolesky%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 16:55 UTC
Resolved - This incident has been resolved.
Jul 20, 17:49 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Sepolia: Block ingestor lagging behind"
Jul 20, 16:55 UTC
Investigating - [FIRING:1] Sepolia: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Sepolia: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/fdjeivgj62g3kd/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DSepolia%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 16:54 UTC
Resolved - This incident has been resolved.
Jul 20, 17:48 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: BSC: Block ingestor lagging behind"
Jul 20, 16:55 UTC
Investigating - [FIRING:1] BSC: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = BSC: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = BSC: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/bdjeivgiykr28f/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DBSC%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 16:53 UTC
Resolved - This incident has been resolved.
Jul 20, 17:24 UTC
Update - [Comment from Opsgenie]ruslan acknowledged alert: "[Grafana]: Osmosis-1: Block ingestor lagging behind"
Jul 20, 16:33 UTC
Investigating - [FIRING:1] Osmosis-1: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = Osmosis-1: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Osmosis-1: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivf1h90jka/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DOsmosis-1%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 16:29 UTC
Resolved - This incident has been resolved.
Jul 20, 16:04 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: DatasourceNoData"
Jul 20, 15:44 UTC
Investigating - [FIRING:2] DatasourceNoData Production (partial_outage indexers c448b23e-d637-4e71-a420-d83567efd033 P2-High A true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Moonriver: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Moonriver: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/cdjeivhwz1kaoa/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DMoonriver%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Sepolia: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/fdjeivgj62g3kd/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DSepolia%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 20, 15:44 UTC
Jul 19, 2024

No incidents reported.

Jul 18, 2024
Jul 17, 2024
Resolved - This incident has been resolved.
Jul 17, 14:38 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Hosted Service - Queries: too many failed requests"
Jul 17, 14:33 UTC
Investigating - [FIRING:1] Hosted Service - Queries: too many failed requests Production (P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: C0=0.944910015112
Labels:
- alertname = Hosted Service - Queries: too many failed requests
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Hosted Service - Queries: too many failed requests
Source: https://thegraph.grafana.net/alerting/grafana/9rM_ehMVz/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DHosted+Service+-+Queries%3A+too+many+failed+requests&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=89

Jul 17, 14:28 UTC
Resolved - This incident has been resolved.
Jul 17, 14:08 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: Hosted Service - Queries: too many failed requests"
Jul 17, 13:58 UTC
Investigating - [FIRING:1] Hosted Service - Queries: too many failed requests Production (P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: C0=0.932499097581
Labels:
- alertname = Hosted Service - Queries: too many failed requests
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Hosted Service - Queries: too many failed requests
Source: https://thegraph.grafana.net/alerting/grafana/9rM_ehMVz/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DHosted+Service+-+Queries%3A+too+many+failed+requests&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=89

Jul 17, 13:58 UTC
Jul 16, 2024
Resolved - This incident has been resolved.
Jul 16, 05:43 UTC
Update - [Comment from Opsgenie]Ruslan Rotaru acknowledged alert: "[Grafana]: XLayer-Sepolia: Block ingestor lagging behind"
Jul 15, 05:44 UTC
Investigating - [FIRING:1] XLayer-Sepolia: Block ingestor lagging behind Production (partial_outage indexers P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: B0=0
Labels:
- alertname = XLayer-Sepolia: Block ingestor lagging behind
- cmp = partial_outage
- component = indexers
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = XLayer-Sepolia: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/fdjeivgy2ymf4e/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DXLayer-Sepolia%3A+Block+ingestor+lagging+behind&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 15, 04:43 UTC
Jul 15, 2024
Jul 14, 2024

No incidents reported.

Jul 13, 2024
Resolved - This incident has been resolved.
Jul 13, 23:05 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: DatasourceNoData"
Jul 13, 22:45 UTC
Investigating - [FIRING:2] DatasourceNoData Production (indexers c448b23e-d637-4e71-a420-d83567efd033 P2-High A true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: [no value]
Labels:
- alertname = DatasourceNoData
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Celo-Alfajores: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Celo-Alfajores: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/cdjeivfxsxpmof/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DCelo-Alfajores%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Fantom-Testnet: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Fantom-Testnet: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/fdjeivfrk6uiod/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DFantom-Testnet%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 13, 22:45 UTC
Resolved - This incident has been resolved.
Jul 13, 21:54 UTC
Update - [Comment from Opsgenie]muhammad acknowledged alert: "[Grafana]: DatasourceNoData"
Jul 13, 18:35 UTC
Investigating - [FIRING:2] DatasourceNoData Production (partial_outage indexers c448b23e-d637-4e71-a420-d83567efd033 P2-High A true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Theta-Testnet-001: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Theta-Testnet-001: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/edjeivhuenbi8d/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DTheta-Testnet-001%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Gnosis: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Gnosis: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/fdjeivgoauxoge/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DGnosis%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 13, 18:33 UTC
Resolved - This incident has been resolved.
Jul 13, 17:50 UTC
Update - [Comment from Opsgenie]muhammad acknowledged alert: "[Grafana]: DatasourceNoData"
Jul 13, 17:33 UTC
Investigating - [FIRING:1] DatasourceNoData Production (partial_outage indexers c448b23e-d637-4e71-a420-d83567efd033 P2-High A Holesky: Block ingestor lagging behind true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: [no value]
Labels:
- alertname = DatasourceNoData
- cmp = partial_outage
- component = indexers
- datasource_uid = c448b23e-d637-4e71-a420-d83567efd033
- grafana_folder = Production
- priority = P2-High
- ref_id = A
- rulename = Holesky: Block ingestor lagging behind
- statuspage = true
- team = Infrastructure
Annotations:
- description = Triggered when the block ingestor is lagging
- statuspage = Holesky: Block ingestor lagging behind
Source: https://thegraph.grafana.net/alerting/grafana/ddjeivfpyr6kgb/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDatasourceNoData&matcher=cmp%3Dpartial_outage&matcher=component%3Dindexers&matcher=datasource_uid%3Dc448b23e-d637-4e71-a420-d83567efd033&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=ref_id%3DA&matcher=rulename%3DHolesky%3A+Block+ingestor+lagging+behind&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1

Jul 13, 17:30 UTC
Resolved - This incident has been resolved.
Jul 13, 13:08 UTC
Update - [Comment from Opsgenie]Muhammad Perreira acknowledged alert: "[Grafana]: Hosted Service - Queries: too many failed requests"
Jul 13, 11:48 UTC
Investigating - [FIRING:1] Hosted Service - Queries: too many failed requests Production (P2-High true Infrastructure)
https://thegraph.grafana.net/alerting/list

**Firing**

Value: C0=0.7596404755
Labels:
- alertname = Hosted Service - Queries: too many failed requests
- grafana_folder = Production
- priority = P2-High
- statuspage = true
- team = Infrastructure
Annotations:
- statuspage = Hosted Service - Queries: too many failed requests
Source: https://thegraph.grafana.net/alerting/grafana/9rM_ehMVz/view?orgId=1
Silence: https://thegraph.grafana.net/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DHosted+Service+-+Queries%3A+too+many+failed+requests&matcher=grafana_folder%3DProduction&matcher=priority%3DP2-High&matcher=statuspage%3Dtrue&matcher=team%3DInfrastructure&orgId=1
Dashboard: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1
Panel: https://thegraph.grafana.net/d/7rcuDImZk?orgId=1&viewPanel=89

Jul 13, 11:48 UTC