
Adapt maintenance mode pallet to MessageQueue dmp/xcmp pause #25

Conversation

@fgamundi (Contributor) commented Feb 6, 2024

Adapts the maintenance mode pallet to work with the message queue pallet, which (partially) replaces DmpQueue and XcmpQueue, introduced by paritytech/polkadot-sdk#1246.

@fgamundi fgamundi marked this pull request as ready for review February 7, 2024 14:11
@Agusrodri (Contributor) left a comment

Definitely a nice cleanup of the pallet :)

@fgamundi fgamundi merged commit dd58b62 into Moonsong-Labs:fg-polkadot-1.6.0 Feb 7, 2024
8 of 10 checks passed
@fgamundi fgamundi deleted the fg-polkadot-v1.6.0-maintenance-mode branch February 7, 2024 14:25
fgamundi added a commit that referenced this pull request Feb 20, 2024
* build and tests pass

* FungibleAdapter

* fmt

* Cleanup

* rust-src to rust-toolchain

* Fixed relay chain comments

* Allow pallet-account-set warning

* Add reminder comment

* BeforeAllRuntimeMigrations bound for MaintenanceMode pallet hooks

* Adapt maintenance mode pallet to MessageQueue dmp/xcmp pause (#25)

* Adapt maintenance mode pallet to MessageQueue dmp/xcmp pause

* Remove hooks and add QueuePausedQuery tests

* Change template chain spec builder to full config

* Added sp_genesis_builder impl to runtime

* Removed unused maintenance mode types

* development_config spec properties

* Fix mocks

* Remove todo

* Cleanup