Cloudera will leverage NVIDIA AI Enterprise, with NVIDIA NIM microservices, to unlock the potential of 25 exabytes of enterprise data secured in Cloudera Data Platform
Cloudera, the data company for trusted enterprise AI, today announced its expanded collaboration with NVIDIA. Cloudera Powered by NVIDIA will integrate enterprise-grade NVIDIA NIM microservices, part of the NVIDIA AI Enterprise software platform, into Cloudera Machine Learning, a Cloudera Data Platform service for AI/ML workflows, to deliver fast, secure, and simplified end-to-end generative AI workflows in production.
Enterprise data, combined with a comprehensive full-stack platform optimized for large language models (LLMs), plays a critical role in advancing an organization’s generative AI applications from pilot to production. NVIDIA NIM and NeMo Retriever microservices let developers link AI models to their business data — including text, images, and visualizations such as bar graphs, line plots, and pie charts — to generate highly accurate, contextually relevant responses. Developers using these microservices can deploy applications through NVIDIA AI Enterprise, which provides optimized runtimes for building, customizing, and deploying enterprise-grade LLMs. By leveraging NVIDIA microservices, Cloudera Machine Learning will enable customers to unleash the value of their enterprise data under Cloudera management by bringing high-performance AI workflows, AI platform software, and accelerated computing to the data, wherever it resides.
Cloudera will introduce multiple integrations with NVIDIA microservices. Cloudera Machine Learning will integrate model and application serving powered by NVIDIA microservices to boost model inference performance across all workloads. With this new AI model-serving functionality, customers can achieve fault tolerance, low-latency serving, and auto-scaling for models deployed anywhere, in both public and private clouds. Additionally, Cloudera Machine Learning will offer integrated NVIDIA NeMo Retriever microservices to simplify connecting custom LLMs to enterprise data. This capability will enable users to build retrieval-augmented generation (RAG) applications for production use.
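The retrieval-augmented generation pattern mentioned above can be illustrated with a minimal sketch. This is not the NeMo Retriever or Cloudera Machine Learning API; keyword-overlap scoring stands in for the learned embedding retrieval a production system would use, and the corpus and function names are hypothetical.

```python
# Minimal RAG sketch: retrieve the most relevant enterprise documents
# for a query, then ground the LLM prompt in that retrieved context.
# Keyword overlap is a toy stand-in for embedding-based retrieval.

def _tokens(text: str) -> set[str]:
    """Lowercase, punctuation-stripped tokens."""
    return {t.strip(".,?!") for t in text.lower().split()}

def score(query: str, doc: str) -> int:
    """Count query tokens that also appear in the document."""
    doc_tokens = _tokens(doc)
    return sum(1 for tok in _tokens(query) if tok in doc_tokens)

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most relevant to the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble a prompt grounded in the retrieved context."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

if __name__ == "__main__":
    corpus = [
        "Q3 revenue grew 12% year over year.",
        "The onboarding checklist has five steps.",
        "Quarterly revenue is reported in the finance portal.",
    ]
    print(build_prompt("Where is quarterly revenue reported?", corpus))
```

Grounding the prompt in retrieved enterprise data, rather than relying on the model's parametric knowledge alone, is what lets RAG applications return contextually relevant answers and reduce hallucinations.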
Cloudera previously worked with NVIDIA to harness GPU-optimized data processing through the integration of the NVIDIA RAPIDS Accelerator for Apache Spark into the Cloudera Data Platform. Now, with the planned addition of NVIDIA microservices and integration with NVIDIA AI Enterprise, Cloudera Data Platform will uniquely deliver streamlined end-to-end hybrid AI pipelines.
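For context on the existing RAPIDS integration, enabling the RAPIDS Accelerator in a Spark job is roughly a configuration change rather than a code change. The sketch below is illustrative, not Cloudera-specific setup guidance; the jar filename and version are placeholders, and the configuration keys follow NVIDIA's RAPIDS Accelerator for Apache Spark documentation.

```shell
# Illustrative spark-submit invocation with the RAPIDS Accelerator enabled.
# Jar name/version are placeholders; consult the RAPIDS docs for your release.
spark-submit \
  --jars rapids-4-spark_2.12-<version>.jar \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin \
  --conf spark.rapids.sql.enabled=true \
  your_etl_job.py
```

Because the plugin intercepts Spark SQL and DataFrame operations at the physical-plan level, existing pipelines can gain GPU acceleration without rewriting application logic.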
Moving forward, organizations across industries will have the ability to more quickly and intuitively build, customize, and deploy the LLMs that underpin transformative generative AI. This includes applications such as coding copilots that speed development, chatbots that automate customer interactions and service, text summarization apps that process documents quickly, streamlined contextual search, and much more. These innovations shorten time to business value by making data and advanced AI processes easier and faster across the enterprise, increasing revenue generation and optimizing cost.
“Cloudera is integrating NVIDIA NIM and CUDA-X microservices to power Cloudera Machine Learning, helping customers turn AI hype into business reality,” said Priyank Patel, Vice President of AI/ML Products at Cloudera. “In addition to delivering powerful generative AI capabilities and performance to customers, the results of this integration will empower enterprises to make more accurate and timely decisions while also mitigating inaccuracies, hallucinations, and errors in predictions – all critical factors for navigating today’s data landscape.”
“Enterprises are eager to leverage their massive volumes of data for generative AI to build custom copilots and productivity tools,” said Justin Boitano, Vice President of Enterprise Products at NVIDIA. “The integration of NVIDIA NIM microservices into the Cloudera Data Platform offers developers a way to more easily and flexibly deploy LLMs to drive business transformation.”