Google has introduced a new data compression algorithm, which the company believes will make the Internet faster for all users. Known as Zopfli, the open-source algorithm is said to increase data transfer speeds and reduce web page load times, producing compressed output up to 8 percent smaller than that of the zlib software library.
According to Google, the new algorithm is named after a Swiss bread recipe and is an implementation of the Deflate algorithm, which is used in the popular ZIP archive format, as well as in gzip file compression.
“The smaller compressed size allows for better space utilization, faster data transmission, and lower web page load latencies. Furthermore, the smaller compressed size has additional benefits in mobile use, such as lower data transfer fees and reduced battery use,” says Google in a blog post.
“The higher data density is achieved by using more exhaustive compression techniques, which make the compression a lot slower, but do not affect the decompression speed. The exhaustive method is based on iterating entropy modeling and a shortest path search algorithm to find a low bit cost path through the graph of all possible deflate representations.”
Google further explains in its blog that the output generated by Zopfli is about 3 to 8 percent smaller than zlib's at maximum compression. Written in C, Zopfli is a compression-only library whose output is bit-stream compatible with the compression used in gzip, ZIP, PNG, HTTP requests, and elsewhere.
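That bit-stream compatibility is the practical point: a stream produced by any conforming Deflate compressor, however much effort it spent, decompresses with the ordinary inflater already on every client. As a rough illustration (using Python's standard zlib module in place of Zopfli itself), two compressors working at very different effort levels emit streams the same decompressor handles identically:

```python
import zlib

# Sample payload: repetitive text compresses well under Deflate.
data = b"Zopfli produces deflate-compatible streams. " * 100

# Compress at two effort levels -- loosely analogous to zlib versus a
# slower, more exhaustive compressor like Zopfli: more work, smaller
# output, same stream format.
fast = zlib.compress(data, 1)   # quick, larger output
best = zlib.compress(data, 9)   # slower, smaller output

# Both are valid zlib/Deflate streams: one decompressor restores the
# identical payload regardless of how hard the compressor worked.
assert zlib.decompress(fast) == data
assert zlib.decompress(best) == data
assert len(best) <= len(fast)
```

This is why a Zopfli-compressed static asset can be served to existing browsers with no client-side changes: only the compressor is swapped out.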
“Due to the amount of CPU time required — 2 to 3 orders of magnitude more than zlib at maximum quality — Zopfli is best suited for applications where data is compressed once and sent over a network many times, for example, static content for the web. By open sourcing Zopfli, thus allowing webmasters to better optimize the size of frequently accessed static content, we hope to make the Internet a bit faster for all of us,” Google concludes in the blog post.