In an era of big data, fast, reliable, affordable, and scalable databases are a necessity, not a luxury. Our friends over at SQream Technologies invest a lot of time and effort into providing their customers with the best performance at scale. Dr. Benjamin C. van Zuiden of SQream wrote a special report, “Beyond the Delta: Compression is a Must for Big Data,” that focuses on the compression algorithms that make big data at scale possible.
Beyond the Delta: Compression is a Must for Big Data
To deliver that performance at scale, SQream DB uses state-of-the-art HPC techniques. Some of these techniques rely on modifying existing algorithms to external technological […]
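The “delta” in the report’s title refers to delta encoding, a common baseline in columnar databases: instead of storing raw values, each value is stored as the difference from its predecessor, so slowly-changing or nearly-sorted columns reduce to small residuals that downstream entropy coders handle well. A minimal sketch of the idea (illustrative code, not SQream’s implementation):

```python
def delta_encode(values):
    """Keep the first value; store every later value as the
    difference from its predecessor."""
    return [values[0]] + [b - a for a, b in zip(values, values[1:])]

def delta_decode(deltas):
    """Rebuild the original sequence by running-summing the deltas."""
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

# A slowly-changing column collapses to small residuals.
readings = [1000, 1001, 1003, 1002, 1005]
deltas = delta_encode(readings)   # [1000, 1, 2, -1, 3]
assert delta_decode(deltas) == readings
```

The small residuals are cheap to store (fewer significant bits) and compress far better than the raw values, which is why delta encoding is usually one stage in a longer compression pipeline rather than the whole story.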
Building Fast Data Compression Code for Cloud and Edge Applications
Finding efficient ways to compress and decompress data is more important than ever. Compressed data takes up less space and requires less time and network bandwidth to transfer. In this article, we’ll discuss the data compression functions and the latest improvements in the Intel® Integrated Performance Primitives (Intel® IPP) library.
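The trade-off the article explores, spending CPU cycles on compression to save storage and network bandwidth, is easy to see with any general-purpose codec. The sketch below uses Python’s standard `zlib` module as a stand-in (it is not the Intel IPP API, which provides optimized implementations of codecs in this family):

```python
import zlib

# Highly repetitive payload, typical of log or columnar data.
payload = b"sensor_reading=42;" * 1000

# Compress at the default-ish level 6, then round-trip it.
compressed = zlib.compress(payload, level=6)
restored = zlib.decompress(compressed)

assert restored == payload
assert len(compressed) < len(payload)
print(f"{len(payload)} bytes -> {len(compressed)} bytes")
```

For repetitive data like this, the compressed form is a small fraction of the original, which translates directly into less storage and less time on the wire; libraries such as Intel IPP aim to shrink the CPU cost of that same round-trip.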