The term "data compression" refers to reducing the number of bits of information that has to be stored or transmitted. Compression can be lossless or lossy: in the first case only redundant data is removed, so when the data is later uncompressed, the information and its quality are identical to the original; in the second case, data considered unnecessary is discarded as well, so the restored quality is lower. Different compression algorithms are more effective for different kinds of information. Compressing and uncompressing data normally takes a lot of processing time, so the server performing the operation must have sufficient resources to process the data quickly. One example of how information can be compressed is to store how many consecutive positions in the binary code should contain 1 and how many should contain 0, instead of storing the actual 1s and 0s.
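The idea of storing run counts instead of the raw 1s and 0s is known as run-length encoding. A minimal sketch in Python (the function names here are illustrative, not from any particular library):

```python
def rle_encode(bits: str):
    """Encode a string of '0'/'1' characters as (bit, run_length) pairs."""
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([b, 1])       # start a new run
    return [(b, n) for b, n in runs]

def rle_decode(runs) -> str:
    """Expand the (bit, run_length) pairs back into the original string."""
    return "".join(b * n for b, n in runs)

encoded = rle_encode("1111100011")
print(encoded)                        # [('1', 5), ('0', 3), ('1', 2)]
assert rle_decode(encoded) == "1111100011"   # lossless round trip
```

Because only run lengths are stored, long stretches of identical bits shrink dramatically, while data that alternates frequently gains little or nothing, which is one reason different algorithms suit different kinds of information.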

Data Compression in Cloud Hosting

The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. It is considerably faster than other widely used algorithms, particularly at compressing and uncompressing non-binary data, i.e. web content. LZ4 can even uncompress data faster than the data can be read from a hard drive, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we can generate several backup copies of all the content stored in the cloud hosting accounts on our servers every day. Both your content and its backups require less space, and since both ZFS and LZ4 work very fast, backup generation does not affect the performance of the hosting servers where your content is kept.
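The compress-on-write, uncompress-on-read behaviour described above can be sketched in a few lines. Python's standard library has no LZ4 binding, so this sketch substitutes zlib for LZ4; the class and file name are illustrative and do not reflect how ZFS is actually implemented, only the transparent-compression idea:

```python
import zlib

class CompressedStore:
    """Toy store that compresses data on write and uncompresses on read,
    the way a ZFS dataset with compression enabled does transparently.
    zlib stands in for LZ4 here, since the stdlib has no LZ4 module."""

    def __init__(self):
        self._blocks = {}

    def write(self, name: str, data: bytes) -> None:
        self._blocks[name] = zlib.compress(data)

    def read(self, name: str) -> bytes:
        return zlib.decompress(self._blocks[name])

    def stored_size(self, name: str) -> int:
        return len(self._blocks[name])

store = CompressedStore()
page = b"<html><body>" + b"hello world " * 500 + b"</body></html>"
store.write("index.html", page)

assert store.read("index.html") == page     # lossless round trip
print(len(page), store.stored_size("index.html"))
```

Repetitive content such as HTML compresses very well, so both the live copy and every backup occupy far less disk space than the raw data would.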