Release v1.0.1, via point-release #5021

Closed

conorsch opened this issue Jan 29, 2025 · 1 comment
conorsch commented Jan 29, 2025

Tooling Release

In order to ship some minor improvements and bug fixes, let's prepare a v1.0.1 release, flushing out the current contents of the main branch. In particular, we want to provide bugfixes for regressions introduced in the recent v1 release (#4991).

Changes to include

Compatibility

As this is a point-release, all changes must be fully compatible for all nodes and clients.
Careful attention should be given to the delta between the most recent tag and the main branch:
https://github.com/penumbra-zone/penumbra/compare/v1.0.0..main

conorsch self-assigned this Jan 29, 2025
conorsch added a commit that referenced this issue Jan 29, 2025:
## Describe your changes
In bumping the various metrics-related crates recently, we overlooked
that the specific metrics emitted by the `metrics-process` crate had
vanished. Bumping the dependency resolves the regression.
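
For context, a minimal sketch of how process metrics are typically wired up with the `metrics-process` crate (illustrative only; the exporter setup and refresh loop below are assumptions for the sketch, not pd's actual code):

```rust
use std::{thread, time::Duration};

use metrics_exporter_prometheus::PrometheusBuilder;
use metrics_process::Collector;

fn main() {
    // Install a Prometheus recorder so the emitted metrics are exposed.
    PrometheusBuilder::new()
        .install()
        .expect("failed to install Prometheus recorder");

    // The collector emits process-level gauges (CPU time, RSS, open fds, ...).
    let collector = Collector::default();
    // Register the metric descriptions (help text) once.
    collector.describe();

    loop {
        // Refresh the process metrics on an interval.
        collector.collect();
        thread::sleep(Duration::from_secs(5));
    }
}
```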

Updates the pd metrics tests to check for more patterns. These checks
were failing on the main branch, specifically the `pd_process_*`
patterns.

Also refactors the metrics test to use `rstest`, for more granular
output.
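
Illustratively, an `rstest`-parameterized check along these lines gives one test case per expected pattern (the scrape helper and the specific metric names are assumptions for this sketch, not the actual pd test code):

```rust
use rstest::rstest;

// Hypothetical stand-in for scraping pd's metrics endpoint; the real test
// would fetch the live Prometheus page from a running node.
fn metrics_scrape() -> String {
    "pd_process_cpu_seconds_total 1.5\n\
     pd_process_resident_memory_bytes 123456\n"
        .to_string()
}

#[rstest]
#[case("pd_process_cpu_seconds_total")]
#[case("pd_process_resident_memory_bytes")]
fn metrics_contain_expected_pattern(#[case] pattern: &str) {
    let body = metrics_scrape();
    assert!(body.contains(pattern), "missing metric pattern: {pattern}");
}
```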

## Issue ticket number and link

Refs #5021

## Testing and review

I've already confirmed that the new tests failed on main, so CI passing
is enough to merge here.

## Checklist before requesting a review

- [x] I have added guiding text to explain how a reviewer should test
these changes.

- [x] If this code contains consensus-breaking changes, I have added the
"consensus-breaking" label. Otherwise, I declare my belief that there
are no consensus-breaking changes, for the following reason:

  > metrics and tests only, does not affect consensus logic