Data compression is the process of encoding data so that fewer bits need to be stored or transmitted. Compressed data occupies considerably less disk space than the original, so more content can be kept in the same amount of space. Compression algorithms work in different ways: lossless algorithms remove only redundant bits, so when the data is uncompressed there is no loss of quality, while lossy algorithms discard bits deemed unneeded, so the uncompressed data is lower in quality than the original. Compressing and uncompressing content consumes significant system resources, particularly CPU time, so any web hosting platform that compresses data in real time needs enough processing power to support the feature. A simple example of compression is replacing a binary sequence such as 111111 with 6x1, i.e. recording how many consecutive 1s or 0s there are instead of storing the whole sequence; this technique is known as run-length encoding.
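The 111111 to 6x1 substitution described above can be sketched as a tiny run-length encoder. The function names and the comma-separated output format below are illustrative choices, not part of any particular compression standard:

```python
def rle_encode(bits: str) -> str:
    """Collapse each run of identical characters into a count-x-character pair."""
    out = []
    i = 0
    while i < len(bits):
        j = i
        # Advance j to the end of the current run of identical characters.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        out.append(f"{j - i}x{bits[i]}")
        i = j
    return ",".join(out)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding: expand each count-x-character pair back into a run."""
    return "".join(
        ch * int(count)
        for count, ch in (pair.split("x") for pair in encoded.split(","))
    )

print(rle_encode("111111"))      # 6x1
print(rle_encode("0001111011"))  # 3x0,4x1,1x0,2x1
```

Note that run-length encoding only saves space when the input contains long runs; real compressors such as LZ4 combine several techniques to handle arbitrary data.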
Data Compression in Cloud Website Hosting
The ZFS file system that runs on our cloud Internet hosting platform uses a compression algorithm called LZ4. LZ4 is among the fastest compression algorithms available and is particularly effective with non-binary data such as web content. LZ4 can often decompress data faster than it can be read from a hard disk, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we can generate several daily backups of all the content kept in the cloud website hosting accounts on our servers. Both your content and its backups take up less space, and since ZFS and LZ4 both work extremely fast, generating the backups does not affect the performance of the hosting servers where your content is kept.
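On ZFS, LZ4 compression is a per-dataset property. The commands below are a sketch of how an administrator might enable it and check the resulting savings; the pool and dataset name `tank/web` is hypothetical:

```shell
# Enable LZ4 compression on a dataset (hypothetical pool/dataset name).
# Only data written after this point is compressed.
zfs set compression=lz4 tank/web

# Confirm the property took effect.
zfs get compression tank/web

# Inspect the achieved compression ratio for the dataset.
zfs get compressratio tank/web
```

Because ZFS compresses transparently at the file system level, applications and websites on the dataset need no changes to benefit.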