
Does the GZIP compression level affect decompression?

Updated: 2023-11-10 08:36:04

Great question, and an underexposed issue. Your intuition is solid – for some compression algorithms, choosing the max level of compression can require more work from the decompressor when it's unpacked.

Luckily, that's not true for gzip – there's no extra overhead for the client/browser to decompress more heavily compressed gzip files (e.g. choosing 9 for compression instead of 6, assuming the standard zlib codebase that most servers use). The best measure for this is decompression rate, which for present purposes is in units of MB/sec, while also monitoring overhead like memory and CPU. Simply going by decompression time is no good because the file is smaller at higher compression settings, and we're not controlling for that factor if we're only using a stopwatch.
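As a rough illustration of that measurement approach (a minimal sketch, not the benchmark discussed here), the following Python snippet computes the decompression rate in MB/sec normalized by the *decompressed* size, so the smaller files produced at higher levels don't skew the comparison:

```python
import time
import zlib

# Compressible sample data (~2 MB of repeated text).
data = b"The quick brown fox jumps over the lazy dog. " * 50000

for level in (1, 6, 9):
    compressed = zlib.compress(data, level)
    start = time.perf_counter()
    restored = zlib.decompress(compressed)
    elapsed = time.perf_counter() - start
    # Rate is normalized by decompressed output size, not wall time alone.
    rate_mb_s = len(restored) / elapsed / 1e6
    print(f"level {level}: {len(compressed):>8} bytes compressed, "
          f"{rate_mb_s:,.0f} MB/s decompression")

assert restored == data  # sanity check: lossless round trip
```

A single-shot timing like this is noisy; a real harness would repeat each measurement and also track memory and CPU, as the answer notes.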

  1. gzip decompression quickly gets asymptotic in terms of both time-to-decompress and memory usage once you get past level 6 compressed content. The time-to-decompress flatlines for levels 7, 8, and 9 in the test results linked by Marcus Müller, though that's coarse-grained data given in whole seconds.

You'll also notice in those results that the memory requirements for decompression are flat for all levels of compression at 0.1 MiB. That's almost unbelievable, just a degree of excellence in software that we rarely see. Mark Adler and colleagues deserve massive props for what they achieved. gzip is a very nice format.
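One reason the memory figure can stay flat is that the DEFLATE format itself caps back-references at a 32 KiB window regardless of compression level, so a decompressor sized for that window handles output from every level. A quick check of this property with Python's zlib bindings (an illustration of the format, not the linked test harness):

```python
import zlib

data = b"some repetitive static asset content\n" * 4096

# DEFLATE limits back-references to a 32 KiB window no matter the
# compression level, so an inflater with a 32 KiB window (wbits=15)
# can unpack output from every level -- its memory needs never grow.
for level in range(1, 10):
    compressed = zlib.compress(data, level)
    inflater = zlib.decompressobj(wbits=15)
    out = inflater.decompress(compressed) + inflater.flush()
    assert out == data
```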

The memory use gets at your question about overhead. There really is none. You don't gain much with level 9 in terms of browser decompression speed, but you don't lose anything.

Now, check out these test results for a bit more texture. You'll see how the gzip decompression rate is slightly faster with level 9 compressed content than with lower levels (at level 9, decomp rate is about 0.9% faster than at level 6, for example). That is interesting and surprising. I wouldn't expect the rate to increase. That was just one set of test results – it may not hold for other scenarios (and the difference is quite small in any case).

Parting note: Precompressing static files is a good idea, but I don't recommend gzip at level 9. You'll get smaller files than gzip-9 by instead using zopfli or libdeflate. Zopfli is a well-established gzip compressor from Google. libdeflate is newer but quite excellent; in my testing it consistently beats gzip-9, though it still trails zopfli. You can also use 7-Zip to create gzip files, and it will consistently beat gzip-9. (In the foregoing, gzip-9 refers to using the canonical gzip or zlib implementation that Apache and nginx use.)
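As a sketch of the precompression workflow (using Python's stdlib gzip as a stand-in; zopfli and libdeflate ship their own tools, which would replace the compressor here for smaller output — the file names below are illustrative):

```python
import gzip
import shutil
from pathlib import Path

def precompress(path: Path, level: int = 9) -> Path:
    """Write a sibling .gz file for servers that serve precompressed
    assets (e.g. nginx with gzip_static on). Stdlib gzip stands in
    here; swap in zopfli or libdeflate for smaller files."""
    out = path.with_name(path.name + ".gz")
    with open(path, "rb") as src, gzip.open(out, "wb", compresslevel=level) as dst:
        shutil.copyfileobj(src, dst)
    return out

# Example usage with a throwaway file:
asset = Path("example.css")
asset.write_bytes(b"body { margin: 0; }\n" * 500)
gz = precompress(asset)
assert gzip.decompress(gz.read_bytes()) == asset.read_bytes()
```

Since any gzip stream decompresses the same way, swapping the compressor is transparent to clients: the browser never knows whether zlib, zopfli, or libdeflate produced the file.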