From Storage to Story: Delivering New Value by Unlocking the Power of Data
Our friends over at Kin+Carta know that unlocking the full value of data, and figuring out where to start, can be difficult. That is why the company has authored this whitepaper on four clear ways to make data work, helping you take yours from storage to story, from modernization through to product optimization. The company’s approach is focused on creating digital products with data to enhance customer and business outcomes.
Infinidat Expands InfiniBox Line with New Solid-State Array to Deliver High Performance for the Most Demanding Enterprise Applications
Infinidat, a leading provider of enterprise-class storage solutions, announced the new InfiniBox SSA™, a groundbreaking solid-state array that delivers the industry’s highest performance for the most demanding enterprise applications. The InfiniBox SSA is powered by Infinidat’s proven deep learning software algorithms and extensive DRAM cache. It will consistently deliver performance and latency results that surpass all-flash arrays (AFAs), while providing the same acclaimed customer experience, 100 percent availability, and uncompromising reliability of the InfiniBox.
Four Reasons On-premises Object Storage is Right for Today’s Businesses
In this special guest feature, Marcel Hergaarden, senior manager for product marketing at Red Hat, explains why he believes on-premises object-based storage is the correct approach for organizations that want better control over their data and greater cost savings.
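To make the idea concrete, here is a minimal sketch of how an application talks to an on-premises, S3-compatible object store (for example, the Ceph RADOS Gateway that underpins Red Hat’s object storage). The endpoint URL, bucket name, credentials, and metadata below are placeholder assumptions, not details from the article:

```python
# Minimal sketch: writing to an on-premises, S3-compatible object store.
# The endpoint, bucket, keys, and metadata are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.internal:8443",  # on-prem gateway
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

# Objects carry their own metadata, so the store (not the application)
# can track context such as provenance and retention.
s3.put_object(
    Bucket="analytics-archive",
    Key="2024/telemetry/batch-0001.parquet",
    Body=open("batch-0001.parquet", "rb"),
    Metadata={"source": "edge-cluster-7", "retention": "5y"},
)
```

Because the API is the same whether the endpoint is in your data center or a public cloud, keeping the store on premises is largely a question of control and cost rather than application changes.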
The Value of Data Now vs. Data Later
In this contributed article, Fluency CEO and Founder Chris Jordan discusses the inevitable end of Moore’s law. He notes that 90% of the world’s data has been produced in the last two years, yet companies analyze only 12% of it. With big data only continuing to grow, how can more innovative data storage solutions, such as the cloud, effectively respond to this level of growth?
New Study Details Importance of TCO for HPC Storage Buyers
Total cost of ownership (TCO) now rivals performance as a top criterion for purchasing high-performance computing (HPC) storage systems, according to an independent worldwide study published by Hyperion Research and commissioned by our friends over at Panasas®, a leader in HPC data storage solutions. Because HPC users define TCO differently, it is difficult to make comparisons based on a predefined set of attributes. With this in mind, the study surveyed data center planners and managers, storage system managers, purchasing decision-makers and key influencers, and users of HPC storage systems about the importance of TCO in general, and about specific TCO components that buyers have mentioned frequently over the past two years.
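As a rough illustration of why such comparisons are hard, the sketch below rolls a few commonly cited cost components up into a lifetime total. The categories and dollar figures are hypothetical assumptions for illustration, not findings from the study:

```python
# Illustrative TCO roll-up for an HPC storage system over its service life.
# All cost categories and figures are hypothetical, not from the Hyperion study.
LIFETIME_YEARS = 5

acquisition = 1_200_000            # hardware plus initial software licenses
annual_costs = {
    "support_contracts": 90_000,
    "power_and_cooling": 45_000,
    "admin_staff_time": 60_000,
    "capacity_expansion": 75_000,
}

tco = acquisition + LIFETIME_YEARS * sum(annual_costs.values())
print(f"{LIFETIME_YEARS}-year TCO: ${tco:,}")  # -> 5-year TCO: $2,550,000
```

Two buyers who weight these categories differently will rank the same systems differently, which is exactly why the study asks about individual TCO components rather than a single number.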
NVIDIA’s New Data Science Workstation – a Review and Benchmark
This new whitepaper from NVIDIA’s Authorized Channel Partner, PNY Technologies, tests and reviews the recently released Data Science Workstation, a PC that brings together data science hardware and software in one package. The workstation is a powerhouse, packing the computing muscle and software stack needed to plow through large datasets.
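The workstation ships with NVIDIA’s RAPIDS software stack; as a minimal sketch of the GPU-accelerated workflow it targets (the column names and values here are illustrative, not drawn from the benchmark), a pandas-style aggregation can run entirely on the GPU with cuDF:

```python
# Minimal sketch of a GPU-accelerated dataframe workflow with RAPIDS cuDF,
# the kind of workload the Data Science Workstation is built for.
# Column names and values are illustrative.
import cudf

df = cudf.DataFrame({
    "sensor":  ["a", "b", "a", "c", "b", "a"],
    "reading": [0.9, 1.4, 1.1, 2.3, 1.6, 0.8],
})

# The groupby/aggregate runs entirely on the GPU.
summary = df.groupby("sensor").agg({"reading": ["mean", "max"]})
print(summary)
```

Because cuDF mirrors the pandas API, existing data-prep code can often move to the GPU with few changes, which is the core of the workstation’s value proposition.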
Qumulo Offers Free Cloud Software to help Fight COVID-19 Outbreak
Today Qumulo announced it is offering its cloud-native file software for free to public and private sector medical and healthcare research organizations working to minimize the spread and impact of COVID-19.
insideAI News Guide to Optimized Storage for AI and Deep Learning Workloads – Part 3
Artificial Intelligence (AI) and Deep Learning (DL) represent some of the most demanding workloads in modern computing history, presenting unique challenges to compute, storage, and network resources. In this technology guide, insideAI News Guide to Optimized Storage for AI and Deep Learning Workloads, we’ll see how traditional file storage technologies and protocols like NFS starve AI workloads of data, reducing application performance and impeding business innovation. A state-of-the-art AI-enabled data center should concurrently and efficiently service the entire spectrum of activities in DL workflows, including data ingest, data transformation, training, inference, and model evaluation.
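To see where storage becomes the bottleneck, consider a typical training input pipeline. The sketch below is a generic PyTorch example, not drawn from the guide itself; the dataset class and file names are hypothetical. Multiple worker processes read samples in parallel, so overall throughput is bounded by how fast the storage system can serve those reads:

```python
# Generic sketch of a deep learning ingest pipeline (not from the guide):
# multiple DataLoader workers read training samples concurrently, so the
# storage system's ability to serve parallel reads sets the throughput ceiling.
import torch
from torch.utils.data import DataLoader, Dataset

class FileBackedDataset(Dataset):
    """Hypothetical dataset that reads one sample per file from shared storage."""
    def __init__(self, paths):
        self.paths = paths

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        # Each call is a storage read; under heavy parallel load, protocols
        # like NFS serialize poorly and leave the GPUs waiting on data.
        return torch.load(self.paths[idx])

loader = DataLoader(
    FileBackedDataset(paths=["sample_0.pt", "sample_1.pt"]),  # placeholder files
    batch_size=64,
    num_workers=8,        # parallel readers to keep the GPU fed
    pin_memory=True,      # faster host-to-GPU transfer
    prefetch_factor=4,    # queue batches ahead of the training step
)
```

If raising num_workers stops improving step time, the pipeline is storage-bound, which is the condition an optimized AI storage layer is meant to eliminate.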
insideAI News Guide to Optimized Storage for AI and Deep Learning Workloads – Part 2
Part 2 of our technology guide continues the look at how to architect storage that can concurrently and efficiently service every stage of the DL workflow, from data ingest and transformation through training, inference, and model evaluation.