Data compression is the encoding of information using fewer bits than the original representation, so the compressed data takes up less disk space and more content can be stored or transmitted in the same amount of space. There are many compression algorithms that work in different ways. With some of them, only redundant bits are removed, so when the data is uncompressed there is no loss of quality; these are known as lossless algorithms. Lossy algorithms, by contrast, discard bits deemed unnecessary, so uncompressing the data afterwards yields lower quality than the original. Compressing and uncompressing content consumes significant system resources, in particular CPU time, so any web hosting platform that compresses data in real time needs enough processing power to support that feature. A simple example of how information can be compressed is to substitute a bit sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence.
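The counting trick described above is known as run-length encoding. A minimal sketch in Python might look like this (the function names and the `<count>x<bit>` output format are illustrative choices, not a standard):

```python
def rle_encode(bits: str) -> str:
    """Collapse each run of identical characters into '<count>x<char>'."""
    out = []
    i = 0
    while i < len(bits):
        j = i
        # Advance j to the end of the current run of identical characters.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        out.append(f"{j - i}x{bits[i]}")
        i = j
    return " ".join(out)

def rle_decode(encoded: str) -> str:
    """Reverse the transformation: '6x1' expands back to '111111'."""
    return "".join(
        ch * int(count)
        for count, ch in (part.split("x") for part in encoded.split())
    )

print(rle_encode("111111"))   # 6x1
print(rle_encode("0001100"))  # 3x0 2x1 2x0
```

Note that this scheme is only a win when the input contains long runs; a string that alternates bits, such as "0101", would actually grow when encoded this way, which is why real compressors combine run-length encoding with other techniques.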