Data compression is the reduction of the number of bits that are stored or transmitted. Compressed data requires substantially less disk space than the original, so more content can be stored using the same amount of space. There are various compression algorithms which work in different ways. With some of them (lossless compression) only redundant bits are removed, so once the information is uncompressed, there is no decrease in quality. Others (lossy compression) discard bits considered unneeded, so uncompressing the data at a later time results in reduced quality compared to the original. Compressing and uncompressing content consumes system resources, particularly CPU processing time, so any hosting platform that uses real-time compression must have sufficient processing power to support that feature. A simple example of how information can be compressed is run-length encoding: a binary sequence such as 111111 is replaced with 6x1, i.e. the number of consecutive 1s or 0s is "remembered" instead of the entire sequence being stored.
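The run-length idea described above can be sketched in a few lines of Python. The function names and the tuple representation here are illustrative choices, not part of any particular compression tool:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Run-length encode a bit string: '111111' becomes [('1', 6)]."""
    runs: list[tuple[str, int]] = []
    for b in bits:
        if runs and runs[-1][0] == b:
            # Same symbol as the previous one: extend the current run.
            runs[-1] = (b, runs[-1][1] + 1)
        else:
            # New symbol: start a new run of length 1.
            runs.append((b, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Reverse the encoding by expanding each run back to its symbols."""
    return "".join(b * n for b, n in runs)

print(rle_encode("111111"))                 # → [('1', 6)]
print(rle_decode(rle_encode("1110001")))    # → 1110001
```

Because decoding reproduces the input exactly, this is a lossless scheme: nothing is discarded, only the redundant repetition is collapsed.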

Data Compression in Web Hosting

The ZFS file system that operates on our cloud web hosting platform employs a compression algorithm known as LZ4. LZ4 is significantly faster than most widely used algorithms, particularly for compressing and uncompressing non-binary data, i.e. web content. It can even uncompress data faster than the data can be read from a hard disk drive, which improves the performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several backups of all the content kept in the web hosting accounts on our servers every day. Both your content and its backups require less space, and since both ZFS and LZ4 work extremely fast, backup generation does not affect the performance of the hosting servers where your content is stored.