Zilliqa Team Investigates Mainnet Disruption

blockchainreporter.net 09 May 2024 14:31, UTC

The Zilliqa infrastructure team continues to work diligently to resolve the recent disruption to mainnet block generation. Preliminary analysis indicates that the issue may stem from a specific function returning a null value instead of a node's RLP encoding after an invalid database lookup. The unexpected null value causes an error to be thrown, which disrupts the network.

Zilfam! 📢 The Zilliqa infrastructure team continues to work on resolving the ongoing disruption to mainnet block generation.

Preliminary analysis indicates this may be related to an issue within a specific function that returns a null value instead of a node's RLP following a… https://t.co/TkcNRhs4hj

— Zilliqa (@zilliqa) May 9, 2024
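Taken at face value, the failure mode described is a lookup that silently yields nothing where RLP bytes are expected. The following is a minimal sketch of that pattern, assuming a hash-keyed node store; the NodeStore type, the node_rlp function, and every other identifier here are hypothetical illustrations, not Zilliqa's actual code.

```rust
// Minimal sketch of the suspected failure mode, assuming a hash-keyed
// node store. All names are hypothetical, not Zilliqa's actual code.
use std::collections::HashMap;

type Hash = [u8; 32];

struct NodeStore {
    // node hash -> RLP-encoded node bytes
    db: HashMap<Hash, Vec<u8>>,
}

impl NodeStore {
    // Returns the node's RLP encoding, or None when the database lookup
    // finds nothing. A caller that unconditionally expects bytes will
    // raise an error on the None case, matching the reported symptom.
    fn node_rlp(&self, hash: &Hash) -> Option<&[u8]> {
        self.db.get(hash).map(|v| v.as_slice())
    }
}

fn main() {
    let store = NodeStore { db: HashMap::new() };
    let missing: Hash = [0u8; 32];
    match store.node_rlp(&missing) {
        Some(rlp) => println!("node RLP: {} bytes", rlp.len()),
        // In the incident as described, this path surfaced as a thrown
        // error that disrupted block generation.
        None => eprintln!("error: no RLP for node {:02x?}", &missing[..4]),
    }
}
```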

To tackle the issue head-on, the team has established internal rehearsal networks that aim to replicate the mainnet disruption. These isolated environments are designed to recreate the problem and safely trial potential bug fixes. Testing in this controlled setting ensures that proposed solutions undergo thorough evaluation before deployment to the testnet and, eventually, the mainnet. The team has said it will share further updates and detailed remediation steps as more information becomes available.
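In a rehearsal environment of this kind, a candidate fix would typically be pinned down with regression-style checks: first reproduce the failure, then confirm the expected behaviour. A hedged sketch, reusing the hypothetical NodeStore from above (redeclared so the snippet stands alone):

```rust
// Hypothetical regression-style checks one might run in an isolated
// rehearsal environment. The NodeStore type is illustrative only.
use std::collections::HashMap;

type Hash = [u8; 32];

struct NodeStore {
    db: HashMap<Hash, Vec<u8>>,
}

impl NodeStore {
    fn node_rlp(&self, hash: &Hash) -> Option<&[u8]> {
        self.db.get(hash).map(|v| v.as_slice())
    }
}

// First reproduce the failure: a lookup for an absent node must be
// handled gracefully rather than bubbling up as a fatal error.
#[test]
fn missing_node_is_handled_gracefully() {
    let store = NodeStore { db: HashMap::new() };
    assert!(store.node_rlp(&[0u8; 32]).is_none());
}

// Then confirm expected behaviour: a node known to exist yields its RLP.
#[test]
fn known_node_lookup_returns_rlp() {
    let mut db = HashMap::new();
    db.insert([1u8; 32], vec![0xc0]); // 0xc0 is the RLP of an empty list
    let store = NodeStore { db };
    assert!(store.node_rlp(&[1u8; 32]).is_some());
}
```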

Initial Slowdown and Response

Approximately 20 hours earlier, a slowdown in mainnet block generation was detected at around 03:12 PM UTC. To diagnose the problem, transaction processing was halted at 04:00 PM UTC, giving the network time to clear the transactions in the mempool and self-recover. After a 30-minute pause, normal operation resumed.

Zilfam! 📢 Around 03:12 PM UTC, a slowdown in mainnet block generation was detected.

To delve deeper, transaction processing was halted at 04:00 PM UTC, permitting the network to clear the transactions in the mempool and self-recover. After 30 minutes, normal operation resumed.…

— Zilliqa (@zilliqa) May 8, 2024
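Mechanically, this kind of recovery step amounts to closing transaction admission, letting the mempool drain, and then reopening. Below is a minimal sketch under those assumptions; the Node struct and its methods are invented for illustration and are not Zilliqa's implementation.

```rust
// Illustrative pause-and-drain recovery step: stop admitting new
// transactions, clear the queued mempool, then resume normal operation.
use std::collections::VecDeque;

struct Node {
    accepting_txs: bool,
    mempool: VecDeque<String>,
}

impl Node {
    // Stop admitting new transactions while diagnosis is under way.
    fn halt_admission(&mut self) {
        self.accepting_txs = false;
    }

    // Let queued transactions clear so the network can self-recover,
    // then reopen transaction admission.
    fn drain_and_resume(&mut self) {
        while let Some(tx) = self.mempool.pop_front() {
            println!("clearing queued tx: {tx}");
        }
        self.accepting_txs = true;
    }
}

fn main() {
    let mut node = Node {
        accepting_txs: true,
        mempool: VecDeque::from(vec![String::from("tx1"), String::from("tx2")]),
    };
    node.halt_admission();
    node.drain_and_resume();
    assert!(node.accepting_txs && node.mempool.is_empty());
}
```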

Before re-enabling transactions, the team waited for a new epoch, expected within 20 minutes, to achieve full data synchronization. During this window, they monitored block generation to confirm it remained stable. The team remains committed to identifying the root cause of the disruption and keeping the community informed with regular updates.
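Waiting for an epoch boundary before re-enabling transactions can be modelled as a simple calculation over block heights. The sketch below assumes fixed-length epochs; the epoch length and block numbers are assumptions for illustration, not Zilliqa's actual parameters.

```rust
// Hypothetical sketch of gating transaction re-enablement on the next
// epoch boundary, assuming fixed-length epochs measured in blocks.
const BLOCKS_PER_EPOCH: u64 = 100; // assumed epoch length, not Zilliqa's

fn next_epoch_start(current_block: u64) -> u64 {
    (current_block / BLOCKS_PER_EPOCH + 1) * BLOCKS_PER_EPOCH
}

fn main() {
    let current = 1_234;
    let resume_at = next_epoch_start(current);
    // 66 blocks remain until the next boundary under these assumed numbers.
    println!("re-enable transactions at block {resume_at}");
    assert_eq!(resume_at, 1_300);
}
```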
