Optimize block finalization #3347

Merged Jan 14, 2025 (2 commits)
crates/sc-consensus-subspace/src/archiver.rs (19 additions, 1 deletion)
@@ -58,7 +58,9 @@ use rand::prelude::*;
 use rand_chacha::ChaCha8Rng;
 use rayon::prelude::*;
 use rayon::ThreadPoolBuilder;
-use sc_client_api::{AuxStore, Backend as BackendT, BlockBackend, Finalizer, LockImportRun};
+use sc_client_api::{
+    AuxStore, Backend as BackendT, BlockBackend, BlockchainEvents, Finalizer, LockImportRun,
+};
 use sc_telemetry::{telemetry, TelemetryHandle, CONSENSUS_INFO};
 use sc_utils::mpsc::{tracing_unbounded, TracingUnboundedSender};
 use sp_api::ProvideRuntimeApi;
@@ -916,6 +918,7 @@ where
         + HeaderBackend<Block>
         + LockImportRun<Block, Backend>
         + Finalizer<Block, Backend>
+        + BlockchainEvents<Block>
         + AuxStore
         + Send
         + Sync
@@ -1087,6 +1090,21 @@ where
                 .filter(|block_number| *block_number > client.info().finalized_number);

             if let Some(block_number_to_finalize) = maybe_block_number_to_finalize {
+                {
+                    let mut import_notification = client.every_import_notification_stream();
+
+                    // Drop notification to drop acknowledgement and allow block import to
+                    // proceed
+                    drop(block_importing_notification);
+
+                    while let Some(notification) = import_notification.next().await {
+                        // Wait for importing block to finish importing
+                        if notification.header.number() == &importing_block_number {
[Review thread on the line above]

Member:
Should it be notification.header.number() >= &importing_block_number? Otherwise there can be a race condition where the block is imported before we call client.every_import_notification_stream().
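A minimal sketch of the suggested variant, in the context of the loop above (this is the reviewer's proposal, not what was merged):

while let Some(notification) = import_notification.next().await {
    // >= also unblocks when a later block's notification arrives first, which
    // covers the case where the importing block finished importing before the
    // subscription above was created
    if notification.header.number() >= &importing_block_number {
        break;
    }
}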

Member (Author):
We call this from the context of block import, so for practical purposes Substrate right now will only import one block at a time. But even if it didn't, it doesn't really matter: all we are really trying to do here is delay finalization. It will work fine even if we don't delay it; that just causes unnecessary delays in block propagation, as described in #3321.

Member:
Okay, I see, the acknowledgement in the block notification will avoid such a race condition. Though my concern is more about the archiver being stuck in this loop... Perhaps instead of block_importing_notification_stream we should use every_import_notification_stream to drive the archiver forward, so we don't need to wait for block import to finish.
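For context, the acknowledgement lives in the pre-import notification that the diff drops; a simplified sketch of its shape (field names and types are assumptions for illustration, not the actual Subspace definition):

use futures::channel::mpsc;
use sp_runtime::traits::{Block as BlockT, NumberFor};

// Simplified shape of the pre-import notification (an assumption, for
// illustration only)
struct BlockImportingNotification<Block: BlockT> {
    block_number: NumberFor<Block>,
    // Block import pauses until every subscriber sends on or drops this sender
    acknowledgement_sender: mpsc::Sender<()>,
}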

Member (Author):
We are using every_import_notification_stream though, or am I missing something?

Member:
I mean to replace this stream with every_import_notification_stream:

// Subscribing synchronously before returning
let mut block_importing_notification_stream = subspace_link
    .block_importing_notification_stream
    .subscribe();

Member (Author):
That is an importing stream, not an import stream: it already fires for every single block, before the block is imported. We can't replace it with a stream that processes blocks after import without breaking archiving.
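To illustrate the distinction (a sketch assembled from the code in this PR; the comments are interpretation, not verbatim documentation):

// Subspace's stream: fires for every block *before* it is imported and holds
// an acknowledgement, so the archiver can process blocks in order; this is
// what archiving relies on.
let mut importing = subspace_link.block_importing_notification_stream.subscribe();

// Substrate's client stream: fires only *after* a block has been imported;
// suitable for delaying finalization, but too late to drive archiving.
let mut imported = client.every_import_notification_stream();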

[Diff resumes]

+                            break;
+                        }
+                    }
+                }
+
             // Block is not guaranteed to be present this deep if we have only synced recent
             // blocks
             if let Some(block_hash_to_finalize) =