A cache is a small, fast piece of storage used to reduce the performance impact that a slower bottleneck has on your system.
For example, the cache of a processor (CPU) is a small piece of memory, ranging in size from less than a megabyte to several MB, in which frequently accessed data is stored so that it doesn't have to be loaded from the relatively slow RAM. That's how caching cuts down the number of trips to slower memory: the CPU spends less time waiting, and the memory bus stays free for other operations.
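The same idea shows up in software. Here's a minimal sketch in Python: `load_record` is a hypothetical slow lookup (standing in for a trip to RAM or disk), and `functools.lru_cache` keeps recent results in fast memory so repeated requests skip the slow path entirely.

```python
import time
from functools import lru_cache

@lru_cache(maxsize=128)
def load_record(key):
    # Pretend this is the slow bottleneck (RAM, disk, network...).
    time.sleep(0.01)
    return key * 2

load_record(21)  # slow: computed once, then stored in the cache
load_record(21)  # fast: served straight from the cache, no slow fetch
```

A real CPU cache does this in hardware, of course, but the principle is identical: keep frequently used data in the fastest storage you have.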
Another example of caching is the cache in a hard drive (HDD). This is usually a somewhat bigger cache (about 32 to 64 MB on newer drives). The cache is used as a sort of buffer, where several small write operations can be stored before being written to the platters, so that the OS can continue with other things while the HDD keeps writing at its own constant (slower) speed. Also, while the data is in the cache, the drive can rearrange the queued writes into an optimal order to improve write speeds. This feature is called Native Command Queuing (NCQ), and it's exposed through the AHCI interface.
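To make the buffering-and-reordering idea concrete, here's a toy model in Python. It is only an illustration of the concept, not how drive firmware actually works: `write` returns immediately (like the OS handing a write to the drive cache), and `flush` commits the queued writes in sector order, mimicking the reordering NCQ does to reduce head movement.

```python
class WriteBuffer:
    """Toy model of a drive's write cache with NCQ-style reordering."""

    def __init__(self):
        self.pending = []  # (sector, data) writes waiting in the cache
        self.disk = {}     # writes that actually reached the platters

    def write(self, sector, data):
        # Returns immediately; the OS can move on to other work.
        self.pending.append((sector, data))

    def flush(self):
        # Commit queued writes in sector order to minimize seeking.
        order = []
        for sector, data in sorted(self.pending):
            self.disk[sector] = data
            order.append(sector)
        self.pending.clear()
        return order

buf = WriteBuffer()
buf.write(90, b"end of file")
buf.write(5, b"start of file")
buf.flush()  # commits sector 5 before sector 90, despite arrival order
```

Note how the two writes arrived out of order but get committed in the order that's cheapest for the mechanical head.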
So, caching speeds up your system, because devices don't have to wait for each other to finish a task, thanks to the small but fast piece of memory called cache.
Be careful though. Caches are usually volatile memory, which means that when the power fails, all data stored in them is lost. This can be tricky with big HDD caches: you think a piece of data has already been written to the HDD, but in fact it's still in the cache, waiting in the queue.
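This is why programs that really care about their data (databases, journaling filesystems) explicitly ask for a flush. In Python, for example, `flush()` pushes data out of the program's own buffer, and `os.fsync()` asks the OS to push its caches toward the device (the filename here is just an example):

```python
import os

# "journal.log" is just an example filename.
with open("journal.log", "w") as f:
    f.write("important record\n")
    f.flush()             # push Python's internal buffer to the OS
    os.fsync(f.fileno())  # ask the OS to commit its cache to the device
```

Even then, whether the bytes truly hit the platters before power loss depends on the drive honoring the flush command; some drives with volatile write caches have been known to lie about this, which is exactly the risk described above.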