
Conversation


bricss commented Jul 26, 2025

This PR is an attempt to bring support for br & zstd compression 🗜️ algos into the core.


kanongil left a comment


Zstd compression cannot be supported by hapi itself, as the API is still experimental. This will have to be implemented as a custom compressor / decompressor by the user (or a plugin).
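
For reference, wiring zstd in as a custom compressor could look roughly like this. This is a minimal sketch, assuming hapi's `server.encoder()` / `server.decoder()` extension points and Node's experimental `zlib.createZstdCompress()` / `zlib.createZstdDecompress()` streams (Node 22.15+ / 23.8+):

```js
'use strict';

// Sketch only: registers zstd as a user-defined compressor/decompressor.
// Relies on Node's experimental zstd streams (22.15+ / 23.8+).

const Zlib = require('zlib');
const Hapi = require('@hapi/hapi');

const server = Hapi.server({ port: 3000 });

// Register zstd for response compression and payload decompression
server.encoder('zstd', (options) => Zlib.createZstdCompress(options));
server.decoder('zstd', (options) => Zlib.createZstdDecompress(options));
```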

I still don't see much point in adding built-in brotli support, which is already available through a plugin. The built-in gzip compression makes sense, since it is widely supported and can provide major bandwidth savings. Newer compressors add only a marginal benefit and can have other trade-offs. As such, they should only be enabled when configured by the user.

I like that it is possible to customise the encoder priority, but your implementation doesn't support user-defined encoders. A proper implementation would be more complex, and should probably go in a separate PR.

```diff
 const internals = {
-    common: ['gzip, deflate', 'deflate, gzip', 'gzip', 'deflate', 'gzip, deflate, br']
+    common: [
+        'gzip, deflate, br, zstd',
```

It makes sense to update the common encoding with 'gzip, deflate, br, zstd', as this is used by both Chrome and Firefox. This can go in a separate PR.


bricss commented Jul 28, 2025

I really don't understand why the br & zstd algos cannot be supported by the core. Both of them are widely supported and adopted.

Zstd can be disabled by default, which should address the issue of its experimental status. Or even better, both algos can be disabled by default, allowing usage of user-defined encoders.

Marginal benefits and overall progress are what most users are usually looking for.


bricss commented Jul 29, 2025

I made changes so that both the br & zstd algos are opt-in options, disabled by default. This will also allow user-defined encoders to override the built-in support.

bricss requested a review from kanongil, July 29, 2025 10:37

damusix commented Aug 1, 2025

This looks sane. I think it makes sense to have this be framework-supported out of the box, especially given that it simply leverages Node.js builtins. Even if they are experimental, it helps if the framework ships with it. To @kanongil's point, however, I would add a notice and a link to the MDN docs explicitly stating that Zstd compression is experimental. Thank you @bricss


bricss commented Aug 2, 2025

I added a notice 📝 about the experimental 🥼 status, and also the ability to pass compression options 🎛️ to both algorithms.


damusix commented Aug 2, 2025

@bricss Hey, check out these skip patterns:

```js
it('cleans unused file stream when response is overridden', { skip: !Common.hasLsof }, async () => {
it('creates a server listening on a unix domain socket', { skip: process.platform === 'win32' }, async () => {
```

We're going to need to add some skip patterns for the zstd tests so they only run on Node 22+. That said, it should also be documented that it's Node 22+.

Add the util in ./test/common.js and check against process.version.
@hapi/somever is installed already, so you can leverage that (how-to example here).
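
A rough sketch of what that helper could look like (the `hasZstd` name is hypothetical, and the `Somever.match(version, range)` signature is assumed):

```js
'use strict';

// ./test/common.js (sketch): gate zstd tests on the running Node version

const Somever = require('@hapi/somever');

exports.hasZstd = Somever.match(process.version, '>=22');

// Usage in a test file, mirroring the skip patterns above:
// it('compresses with zstd', { skip: !Common.hasZstd }, async () => { ... });
```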


bricss commented Aug 3, 2025

All done ✅ added a Node version note 📄 and test skippers 🚧


bricss commented Aug 3, 2025

I found a better way to toast 🍞 against zstd support and added an extra assertion 🥽


kanongil left a comment


Regarding Zstd support, I don't think it makes sense to make an exception for the policy around node API feature stability. Especially because there is a standard way to integrate custom compressors.

For brotli, I still don't think we need to include it in core, but I'm not opposed to it, if done properly. This implementation still has several issues, most pressingly a default compression level of 11, which is way too high for just-in-time compression. The default should probably not be more computationally expensive than the default gzip option. A quick test on a 300KB JSON file shows gzip taking 20 ms, while brotli takes 335 ms! For this file, brotli needs to use level 5 to have comparable speed. I would probably choose 4 as the default, improving both compression speed and size.

I wish you would make a separate PR for the priority option, which provides a meaningful feature by itself.
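
For anyone who wants to reproduce the timing comparison, here is a quick sketch using Node's built-in zlib (the payload file name is illustrative; use any ~300KB JSON document):

```js
'use strict';

const Fs = require('fs');
const Zlib = require('zlib');

const payload = Fs.readFileSync('payload.json');    // any ~300KB JSON file

// gzip at its default compression level
console.time('gzip (default level)');
Zlib.gzipSync(payload);
console.timeEnd('gzip (default level)');

// brotli at a few quality levels, including the default (11)
for (const quality of [4, 5, 11]) {
    console.time(`brotli q${quality}`);
    Zlib.brotliCompressSync(payload, {
        params: { [Zlib.constants.BROTLI_PARAM_QUALITY]: quality }
    });
    console.timeEnd(`brotli q${quality}`);
}
```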


bricss commented Aug 3, 2025

The issue with the br compression level is solved ✅

As for the experimental status of zstd: if a user has to enable it manually, it kinda implies that the user is smart 🎓 enough to understand what he/she/ze is doing 🙂


kanongil commented Aug 3, 2025

Hapi core doesn't have "experimental" features, and built-in zstd compression certainly doesn't warrant introducing such a concept.


bricss commented Aug 3, 2025

I couldn't find any policy that mentions experimental features not being allowed.

The only reason why zstd has experimental status is bcos it was recently added to the Node core. It uses the official rock-solid 🪨 zstd library, leveraged via zlib. Its experimental status is due to the amount of time ⌚ it needs to "marinate" 🥒 in the codebase before it can be unflagged.


bricss commented Aug 3, 2025

Meanwhile, @fastify/compress & hono-compress have already had zstd support for months 🗓️


damusix commented Aug 4, 2025

@kanongil It kills the ecosystem to not support things like this. I'm all for stability and not breaking user space, but this shouldn't break user space, only enhance it. It just enables Hapi to use modern JS features. If people are using Hapi, want these features, and contribute, I can't think of a good reason to not allow it.

@bricss Sorry mate, one last thing: you'll need to tell Lab not to cover those lines, since they won't be covered on anything under Node 22 (https://hapi.dev/module/lab/api/?v=26.0.0#inline-enablingdisabling). [Screenshot: 2025-08-04 at 12:42:04 PM]
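
Per the linked docs, Lab's inline toggle looks roughly like this (the guarded code here is illustrative, not the PR's actual lines):

```js
// $lab:coverage:off$
const hasZstd = typeof Zlib.createZstdCompress === 'function';
// $lab:coverage:on$
```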


bricss commented Aug 4, 2025

Pushed a cure 🩹 for coverage. Hope it works 🙂

bricss force-pushed the feat/add-br+sztd-with-priority-opts branch from 6d07439 to f07fa73 on August 5, 2025 10:15

bricss commented Aug 5, 2025

Rebased on master branch 🌿


kanongil commented Aug 5, 2025

> @kanongil It kills the ecosystem to not support things like this. I'm all for stability and not breaking user space, but this shouldn't break user space, only enhance it. It just enables Hapi to use modern JS features. If people are using Hapi, want these features, and contribute, I can't think of a good reason to not allow it.

That is quite the hot take. If it "kills the ecosystem" to have to use an alternative, slightly more complex, but more powerful API, then it is a very fragile ecosystem indeed. Many/most? deployments won't care about this, as the compression can already be applied in reverse proxies and CDNs.

Implementing features just because a user wants them is never a good path. Features need to be maintained indefinitely, causing more maintenance work. As such, a feature should provide a tangible benefit before it is considered. As I said, the "priority" option is an interesting candidate. Extending the built-in compressors less so, as it doesn't enable anything new. Using experimental APIs, exposing experimental features, and dropping the 100% code coverage policy, even less so!

I understand that you want to support new contributors, as do I, especially since @bricss seems to care a lot about doing it right. That is why I try to help guide their efforts. Unfortunately for new contributors, this project has (or at least used to have) a high coding standard, making it difficult to get anything but small fixes merged.


bricss commented Aug 5, 2025

What kind of maintenance work might it require? Coz it seems the current algos ain't causing any work at all 💀

Shall it wait until zstd gets the "experimental" flag 🚩 removed?


damusix commented Aug 5, 2025

"kills the ecosystem"

Maybe I misspoke. I meant it in the sense of contributions and getting eyes and hands on this project. It's my opinion that the framework should offer support for these native builtins related to what it does (HTTP transmission). The competition offers it and we stay behind. That's generally bad for our market share.

> Implementing features just because a user wants them is never a good path. Features need to be maintained indefinitely, causing more maintenance work. As such, a feature should provide a tangible benefit before it is considered.

I agree, but I think there's an asterisk with this one in particular: what this PR proposes (compression support) is a stable Node.js feature. The API for compression hasn't changed, so I'm not sure what there is to maintain; the complexity here is quite low. We have also seen people drop by and ask for Brotli compression in Hapi in the past. You've even made a plugin to support it. It honestly seems like such a trivial feature-add that it's worth it.

> Using experimental APIs, exposing experimental features, and dropping the 100% code coverage policy, even less so!

Zstd seems to be getting wide adoption; e.g. Python is adopting it in the upcoming 3.14 release: https://docs.python.org/3.14/whatsnew/3.14.html.
I can't imagine Node.js dropping support for it, given that major cloud providers and big tech are using it.

On the code-coverage critique: yeah, 100% fair. Let's uphold the high standards. Let's say we came to a consensus: how would you approach code coverage for this, since it is only Node 21+? We can't keep supporting Node 14 forever. At some point, we have to move with the times. Node 18 is now legacy and unsupported by all major providers.


bricss commented Aug 5, 2025

I'd also like to add that I'm trying to add 🧩 these features to the framework 🧰 not just bcos I personally need them, but bcos I believe it'll be beneficial for the framework as a whole, and help attract 🧲 more users. This should be a win-win 🏆 for everyone: both the community and its users.

I'm already using these features in production, so if it were only about my own needs, it's already covered 🙂


kanongil commented Aug 6, 2025

> On the code-coverage critique: yeah, 100% fair. Let's uphold the high standards. Let's say we came to a consensus: how would you approach code coverage for this, since it is only Node 21+? We can't keep supporting Node 14 forever. At some point, we have to move with the times. Node 18 is now legacy and unsupported by all major providers.

The simple solution is to punt it off to be handled by a third-party plugin, i.e. what we already do. To make it more palatable, hapi could make and publish one such plugin. User-installed plugins don't need to apply the same standards as hapi core; for example, Inert uses a third-party code dependency. All?? other frameworks require plugins for even gzip compression.

Again, regarding brotli, I'm not opposed to including it as such. Though this implementation still has both major and minor issues, the major ones being that it introduces a new, conflicting way to specify compression options, and has no way to enable only the compressor (with no decompression). A proper implementation would probably also revisit the current gzip/deflate integration and allow these to be toggled on/off for consistency.


kanongil commented Aug 6, 2025

> I'd also like to add that I'm trying to add 🧩 these features to the framework 🧰 not just bcos I personally need them, but bcos I believe it'll be beneficial for the framework as a whole, and help attract 🧲 more users. This should be a win-win 🏆 for everyone: both the community and its users.

@bricss I very much appreciate your enthusiasm and efforts, and I apologise that our responses are not more aligned, making you get caught in the middle. Though our contribution guidelines did advise you to "ask first", to avoid working on something that might not get accepted.


damusix commented Aug 6, 2025

@kanongil @bricss Would bringing in brok make sense, making it a general-purpose compression plugin? Meaning, it covers the compression APIs Node offers (including the experimental ones).


bricss commented Aug 6, 2025

IMO, we can withhold zstd from this PR, and I’ll make all necessary changes to address remaining issues.

Alternatively, we could move compression 🗜️ entirely out of the core, into a standalone package 📦 with the name @hapi/compress. But then it will be the same thing as now, only from the other side 🫠


damusix commented Aug 7, 2025

> IMO, we can withhold zstd from this PR, and I’ll make all necessary changes to address remaining issues.

Yeah, makes sense

> Alternatively, we could move compression 🗜️ entirely out of the core, into a standalone package 📦 with the name @hapi/compress. But then it will be the same thing as now, only from the other side 🫠

That's what I was proposing to do, and perhaps use Gil's package as the base, since he's already laid the groundwork for it (up to you, Gil). Thoughts?


kanongil commented Aug 7, 2025

I thought about creating a general-purpose compression plugin, but moving compression wholesale into a plugin would probably go against the "full out-of-the-box functionality" hapi promise.

Making people install a plugin for experimental zstd support does seem quite appropriate. I expect it will work best as a standalone feature. This also means that it can require Node 22+ and ignore old versions. Someone will need to create/implement a new @hapi/zstd-compress (or similar) package, though.
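
A skeleton of such a package might look like this. The plugin name is hypothetical, and it assumes hapi's `server.encoder()` / `server.decoder()` API plus Node 22.15+ for the experimental zstd streams:

```js
'use strict';

// Hypothetical @hapi/zstd-compress plugin (sketch only)

const Zlib = require('zlib');

module.exports = {
    name: 'zstd-compress',
    register(server) {

        server.encoder('zstd', (options) => Zlib.createZstdCompress(options));
        server.decoder('zstd', (options) => Zlib.createZstdDecompress(options));
    }
};
```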

I'm looking into reworking the current implementation to better support new encoders, and to allow built-in encoders/decoders to be selectively disabled/enabled. If merged, this will make it simpler to extend with new encoders and a priority option.


bricss commented Aug 7, 2025

For "full out-of-the-box functionality" it will make more sense just to add missing algos into the core, bump min supported node version to 22.15.0, and release new semver major. No one need support of all those fossil ⚱️ node versions in any way. And if they do, shame on them.

bricss force-pushed the feat/add-br+sztd-with-priority-opts branch from f07fa73 to 22fac0d on September 14, 2025 09:20

bricss commented Sep 14, 2025

Rebased and refactored the implementation, so that every single encoding can be switched ✨ on/off and options can be passed to all of 'em 📎


kanongil left a comment


We still don't need compression options at the server level. The route-level options work fine.

Another major issue is that this enables br and zstd for uploads by default, with no option to disable them. Enabling new upload decompressor engines has both security and performance implications and should not be done without a thorough review.

This review is mainly aimed at the hapi maintainers. I don't plan to spend more time on this PR.


bricss commented Sep 14, 2025

Both br & zstd are disabled by default, and therefore not enabled for uploads by default.

@77fdvarvam

Thanks everyone for the detailed discussion. It seems there’s agreement that Brotli support in core could make sense (with sane defaults), while Zstd probably belongs in a plugin until it’s no longer flagged experimental in Node. I like the direction of making encoders toggleable and options overridable per route vs. server level. That seems like a flexible middle ground that preserves Hapi’s standards while still giving developers modern compression options.


bricss commented Sep 15, 2025

Changed the implementation, as default options can already be configured in the server's default routes configuration or individually per route.


bricss commented Sep 15, 2025

Added an option to disable ⛔ automatic decompression.


bricss commented Sep 15, 2025

Summary 📋 list (a usage sketch follows below):

  • Built-in encodings 👾 can be enabled or disabled
  • Encoder/decoder options 🎛️ can be set globally in the server configuration or overridden per route
  • Decompression can be disabled, which will result in 415 responses ⛔
  • Priority 🔒 for content-encoding compression algorithms can be changed
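
A hypothetical usage sketch reflecting that summary; the option names and shapes here are illustrative of the PR's direction, not a confirmed hapi API:

```js
'use strict';

const Hapi = require('@hapi/hapi');

const server = Hapi.server({
    port: 3000,
    compression: { priority: ['br', 'gzip', 'deflate'] },   // preferred encoder order (hypothetical option)
    routes: {
        compression: {
            br: { quality: 4 },     // opt in to brotli, passing encoder options
            zstd: false,            // built-in zstd stays disabled
            decompress: false       // reject compressed payloads with 415
        }
    }
});
```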
