
fix scale support for upcoming versions #313

Closed · wants to merge 2 commits

Conversation

@pgherveou commented Sep 12, 2023

This PR should fix the breaking changes introduced by upcoming parity-scale-codec versions.

Motivation

We have added some constraints to the HasCompact trait in paritytech/parity-scale-codec#512
to fix MaxEncodedLen for compact fields.

/// Trait that tells you if a given type can be encoded/decoded in a compact way.
pub trait HasCompact: Sized {
	/// The compact type; this can be
-	type Type: for<'a> EncodeAsRef<'a, Self> + Decode + From<Self> + Into<Self>;
+	type Type: for<'a> EncodeAsRef<'a, Self> + Decode + From<Self> + Into<Self> + MaybeMaxEncodedLen;
}

This is so that we can fix the MaxEncodedLen derive macro for compact fields:

- ty.span() => .saturating_add(<#crate_path::Compact::<#ty> as #crate_path::MaxEncodedLen>::max_encoded_len())
+ ty.span() => .saturating_add(<<#ty as #crate_path::HasCompact>::Type as #crate_path::MaxEncodedLen>::max_encoded_len())
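To illustrate what the changed expansion resolves to, here is a minimal sketch using local stand-ins for `HasCompact` and `MaxEncodedLen` (the trait bodies below are simplified assumptions for illustration, not the real crate's definitions):

```rust
// Stand-ins for the parity-scale-codec traits, to show what the new
// derive expansion resolves to. Names mirror the real crate; bodies
// are simplified assumptions.
trait MaxEncodedLen {
    fn max_encoded_len() -> usize;
}

trait HasCompact {
    type Type: MaxEncodedLen;
}

// Hypothetical compact form of u32: worst case is 4 payload bytes
// plus 1 length-prefix byte under SCALE compact encoding.
struct CompactU32;

impl MaxEncodedLen for CompactU32 {
    fn max_encoded_len() -> usize {
        5
    }
}

impl HasCompact for u32 {
    type Type = CompactU32;
}

// What the changed derive generates for a compact field of type T:
// go through <T as HasCompact>::Type rather than Compact<T>.
fn compact_field_max_len<T: HasCompact>() -> usize {
    <T::Type as MaxEncodedLen>::max_encoded_len()
}

fn main() {
    println!("{}", compact_field_max_len::<u32>());
}
```

The point of the new path is that `<T as HasCompact>::Type` is the type actually used for encoding, so its `max_encoded_len` is the bound that matters.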

This change breaks uint, hence this PR, which attempts to fix it.

Solution

This PR implements MaxEncodedLen for CompactUint (and Encode, as required) to fix the compilation issue with upcoming versions.
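A self-contained sketch of that pattern, using local stand-in traits rather than the real parity-scale-codec items (the trait shapes and the `u64`-backed wrapper below are simplified assumptions):

```rust
// Minimal stand-ins showing why the Encode impl is unavoidable:
// MaxEncodedLen has Encode as a supertrait, mirroring parity-scale-codec.
trait Encode {
    fn encode(&self) -> Vec<u8>;
}

trait MaxEncodedLen: Encode {
    fn max_encoded_len() -> usize;
}

// Hypothetical fixed-size wrapper standing in for CompactUint.
struct CompactUint(u64);

impl Encode for CompactUint {
    // The real impl delegates to CompactRefUint; here we just emit
    // little-endian bytes for illustration.
    fn encode(&self) -> Vec<u8> {
        self.0.to_le_bytes().to_vec()
    }
}

impl MaxEncodedLen for CompactUint {
    fn max_encoded_len() -> usize {
        core::mem::size_of::<u64>()
    }
}

fn main() {
    let v = CompactUint(u64::MAX);
    // The encoded length can never exceed the advertised maximum.
    assert!(v.encode().len() <= CompactUint::max_encoded_len());
    println!("{}", CompactUint::max_encoded_len());
}
```

Because `MaxEncodedLen: Encode`, removing the `Encode` impl would make the `MaxEncodedLen` impl fail to compile.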

PR Checklist

  • Added Tests
  • Added Documentation
  • Updated the changelog

Comment on lines +84 to +89
impl<const BITS: usize, const LIMBS: usize> Encode for CompactUint<BITS, LIMBS> {
    fn encode_to<T: Output + ?Sized>(&self, dest: &mut T) {
        let v: CompactRefUint<BITS, LIMBS> = (&self.0).into();
        v.encode_to(dest)
    }
}
@pgherveou (Author):
I need to implement this because MaxEncodedLen inherits from Encode; we can't get away without it.

impl<const BITS: usize, const LIMBS: usize> MaxEncodedLen for CompactUint<BITS, LIMBS> {
    fn max_encoded_len() -> usize {
        Uint::<BITS, LIMBS>::max_encoded_len()
    }
}
@pgherveou (Author):

🤔 These max_encoded_len values seem to be off; I would have expected this test to pass:

    #[test]
    fn test_scale_compact_max_len() {
        const_for!(BITS in [1, 2, 3, 7, 8, 9, 15, 16, 17, 29, 30, 31, 32, 33, 63, 64, 65, 127, 128, 129, 256, 384, 512, 535] {
            const LIMBS: usize = nlimbs(BITS);
            let value: CompactUint<BITS, LIMBS> = Uint::<BITS, LIMBS>::MAX.into();
            assert_eq!(CompactUint::<BITS, LIMBS>::max_encoded_len(), value.encode().len());
        });
    }
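For reference when sanity-checking expected lengths: under the SCALE spec, compact encoding picks one of four modes from the value's magnitude alone, so the encoded length is easy to compute by hand. A pure-stdlib sketch (the helper `compact_len` is hypothetical, not part of parity-scale-codec):

```rust
// SCALE compact encoding length by mode, per the SCALE spec.
fn compact_len(value: u128) -> usize {
    match value {
        0..=0x3F => 1,             // single-byte mode: 6-bit payload
        0x40..=0x3FFF => 2,        // two-byte mode: 14-bit payload
        0x4000..=0x3FFF_FFFF => 4, // four-byte mode: 30-bit payload
        _ => {
            // big-integer mode: 1 prefix byte plus the value's bytes
            // with trailing zero bytes trimmed (at least 4 bytes).
            let bytes = (128 - value.leading_zeros() as usize + 7) / 8;
            1 + bytes
        }
    }
}

fn main() {
    // For a full-width value, compact costs one extra prefix byte
    // over the plain fixed-width encoding.
    println!("{}", compact_len(u32::MAX as u128)); // 4 payload bytes + 1 prefix
    println!("{}", compact_len(u64::MAX as u128)); // 8 payload bytes + 1 prefix
}
```

So for a `Uint` whose `MAX` lands in big-integer mode, the compact maximum should be the byte width plus one prefix byte, which is one place a mismatch with `Uint::max_encoded_len()` could come from.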

@prestwich (Collaborator) commented:
Closing as stale. We will support this once 4.0.0 comes out :)

More context here:
paritytech/parity-scale-codec#662

@prestwich prestwich closed this Dec 17, 2024