Data compression is the process of encoding information using fewer bits than the original representation requires for storage or transmission. Compressed data therefore occupies less disk space than the original, so more content can be stored in the same amount of space. Different compression algorithms work in different ways: many of them remove only redundant bits, so when the data is uncompressed there is no loss of quality (lossless compression). Others discard bits that are deemed unnecessary, so uncompressing the data later yields lower quality than the original (lossy compression). Compressing and uncompressing content consumes considerable system resources, in particular CPU time, so any web hosting platform that applies compression in real time must have adequate processing power to support the feature. A simple example of how data can be compressed is replacing a binary sequence such as 111111 with 6x1, i.e. storing how many consecutive 1s or 0s occur in a row instead of the whole sequence.
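The 111111 → 6x1 idea above is known as run-length encoding. A minimal sketch in Python (the function names and the comma-separated output format are illustrative choices, not part of any particular standard) shows how the original bit string is recovered exactly, making this a lossless scheme:

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a string of 0s and 1s, e.g. '111111' -> '6x1'."""
    if not bits:
        return ""
    runs = []
    current, count = bits[0], 1
    for b in bits[1:]:
        if b == current:
            count += 1
        else:
            runs.append(f"{count}x{current}")
            current, count = b, 1
    runs.append(f"{count}x{current}")
    return ",".join(runs)


def rle_decode(encoded: str) -> str:
    """Reverse rle_encode: '6x1' -> '111111'."""
    if not encoded:
        return ""
    return "".join(
        bit * int(count)
        for count, bit in (run.split("x") for run in encoded.split(","))
    )
```

For example, `rle_encode("111111")` returns `"6x1"`, and decoding the result of any encode returns the exact original string. Note that run-length encoding only saves space when the input contains long runs; data with frequent alternation can actually grow, which is why real file systems use more sophisticated algorithms.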

Data Compression in Hosting

The compression algorithm used on the cloud hosting platform where your new hosting account will be created is called LZ4, and it is employed by the state-of-the-art ZFS file system that powers the platform. LZ4 is superior to the algorithms other file systems use because its compression ratio is much higher and it processes data considerably faster. The speed advantage is most noticeable during decompression, which happens faster than data can be read from a hard disk drive, so LZ4 improves the performance of every website hosted on a server that uses the algorithm. We take advantage of LZ4 in an additional way: its speed and compression ratio allow us to generate multiple daily backups of the entire content of all accounts and keep them for one month. Not only do these backups take up less space, but generating them does not slow the servers down, as can often happen with other file systems.
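LZ4 itself is available to Python through the third-party `lz4` package; the sketch below instead uses the standard library's `zlib` purely as a stand-in, to illustrate the two properties the paragraph relies on: a lossless round trip returns the exact original bytes, and repetitive content (typical of web files and backups) shrinks substantially:

```python
import zlib

# Highly repetitive content, like much website data, compresses well.
original = b"<p>Hello, world!</p>\n" * 500

compressed = zlib.compress(original, level=6)
restored = zlib.decompress(compressed)

# Lossless: the round trip reproduces the original byte-for-byte.
assert restored == original

ratio = len(original) / len(compressed)
print(f"{len(original)} bytes -> {len(compressed)} bytes (~{ratio:.0f}:1)")
```

The same measurement logic applies to LZ4; the difference is that LZ4 trades some compression ratio for much higher throughput, which is what makes real-time file-system compression and frequent backups practical.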