Data compression is the process of encoding information using fewer bits than the original representation. Compressed data requires much less disk space than the original, so more content can be stored in the same amount of space. There are various compression algorithms that work in different ways: with many of them, only redundant bits are removed, so when the data is uncompressed there is no loss of quality (lossless compression). Others discard less essential bits, so uncompressing the data afterwards results in lower quality than the original (lossy compression). Compressing and uncompressing content consumes significant system resources, CPU processing time in particular, so any hosting platform that uses real-time compression needs ample processing power to support this feature. A simple example of how information can be compressed is to substitute a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of keeping the entire sequence.
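The "6x1" idea described above is known as run-length encoding. As a minimal illustration (not the algorithm any particular hosting platform uses), it can be sketched in a few lines of Python:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Run-length encode a string: '111111' -> [('1', 6)]."""
    runs: list[tuple[str, int]] = []
    for b in bits:
        if runs and runs[-1][0] == b:
            # Same symbol as the previous run: just increase its count.
            runs[-1] = (b, runs[-1][1] + 1)
        else:
            # New symbol: start a new run of length 1.
            runs.append((b, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand each (symbol, count) pair back into the original string."""
    return "".join(b * n for b, n in runs)

encoded = rle_encode("1111110000011")
print(encoded)  # [('1', 6), ('0', 5), ('1', 2)]

# Lossless: decoding restores the data exactly.
assert rle_decode(encoded) == "1111110000011"
```

Because decoding reproduces the input bit for bit, this is a lossless scheme; it saves space only when the data contains long runs of repeated symbols.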

Data Compression in Cloud Web Hosting

The compression algorithm used on the cloud web hosting platform where your new cloud web hosting account will be created is called LZ4, and it is provided by the advanced ZFS file system that powers the platform. LZ4 outperforms the algorithms that other file systems use: it achieves a higher compression ratio and processes data considerably faster. The speed advantage is most noticeable during decompression, which happens faster than data can be read from a hard disk drive. Because of this, LZ4 improves the performance of any Internet site hosted on a server that uses it. We take advantage of LZ4 in an additional way as well: its speed and compression ratio allow us to generate multiple daily backups of the entire content of all accounts and keep them for thirty days. Not only do these backups take up less space, but their creation doesn't slow the servers down, as often happens with other file systems.
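The space saving that makes these backups practical is easy to demonstrate. LZ4 itself is not part of the Python standard library, so the sketch below uses the stdlib zlib codec purely as a stand-in to show the general principle of lossless compression and its ratio; the sample text is invented for illustration:

```python
import zlib

# Repetitive data, like much website content, compresses very well.
original = b"ZFS stores each block in compressed form on disk. " * 1000

compressed = zlib.compress(original)
print(f"original:   {len(original)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"ratio:      {len(original) / len(compressed):.1f}x")

# Lossless compression: decompressing restores the data exactly.
assert zlib.decompress(compressed) == original
```

A higher ratio means each backup occupies less disk space; a faster codec such as LZ4 additionally keeps the CPU cost of compressing every block low enough to run in real time.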