Dataiku unveils LLM Mesh; reveals LLM Mesh launch partners

At its Everyday AI Conference in New York, Dataiku has unveiled the LLM Mesh, addressing the critical need for an effective, scalable, and secure platform for integrating Large Language Models (LLMs) in the enterprise. Dataiku is joined in the announcement by its LLM Mesh Launch Partners: Snowflake, Pinecone, and AI21 Labs.

While Generative AI presents a myriad of opportunities and benefits for the enterprise, organizations face notable challenges. These include an absence of centralized administration, inadequate permission controls for data and models, minimal safeguards against toxic content and the exposure of personally identifiable information, and a lack of cost-monitoring mechanisms. Additionally, many organizations struggle to establish best practices for fully harnessing the potential of this emerging technology ecosystem.

Building on Dataiku’s transformative Generative AI capabilities introduced in June 2023, the LLM Mesh is envisioned to overcome these roadblocks to enterprise value. 

LLM Mesh: The Common Backbone for Gen AI Apps 

The LLM Mesh provides the components companies need to efficiently build safe applications using LLMs at scale. With the LLM Mesh sitting between LLM service providers and end-user applications, companies can choose the most cost-effective models for their needs, both today and tomorrow, ensure the safety of their data and responses, and create reusable components for scalable application development.

Components of the LLM Mesh include universal AI service routing, secure access and auditing for AI services, safety provisions for private data screening and response moderation, and performance and cost tracking. The LLM Mesh also provides standard components for application development to ensure quality and consistency while delivering the control and the performance expected by the business. 
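
To make the routing, screening, and cost-tracking ideas above concrete, here is a minimal, hypothetical sketch in Python of what a mesh-style gateway between applications and LLM providers might look like. This is not Dataiku's API; the MeshGateway class, its provider registry, and the crude PII check are illustrative assumptions only.

```python
# Hypothetical sketch of a mesh-style gateway: one entry point that routes
# requests to pluggable providers, screens prompts, and tracks cost.
# None of these names come from Dataiku's product.
import re
from dataclasses import dataclass, field
from typing import Callable, Dict, Tuple


@dataclass
class MeshGateway:
    # provider name -> (completion function, cost per 1,000 prompt characters)
    providers: Dict[str, Tuple[Callable[[str], str], float]] = field(default_factory=dict)
    usage: Dict[str, float] = field(default_factory=dict)

    def register(self, name: str, fn: Callable[[str], str], cost_per_1k_chars: float) -> None:
        self.providers[name] = (fn, cost_per_1k_chars)

    def _screen(self, prompt: str) -> None:
        # Crude private-data check: block prompts that contain an email address.
        if re.search(r"[\w.+-]+@[\w-]+\.[\w.]+", prompt):
            raise ValueError("Prompt appears to contain PII; request blocked.")

    def complete(self, provider: str, prompt: str) -> str:
        self._screen(prompt)
        fn, rate = self.providers[provider]
        self.usage[provider] = self.usage.get(provider, 0.0) + rate * len(prompt) / 1000
        return fn(prompt)


# Route the same application code to interchangeable stand-in "providers".
mesh = MeshGateway()
mesh.register("provider_a", lambda p: f"[provider_a answer to: {p}]", cost_per_1k_chars=0.02)
mesh.register("provider_b", lambda p: f"[provider_b answer to: {p}]", cost_per_1k_chars=0.01)
print(mesh.complete("provider_b", "Summarize our Q3 support ticket themes."))
print(mesh.usage)  # accumulated (hypothetical) cost per provider
```

Because applications call the gateway rather than a specific provider, switching models becomes a registration change rather than a rewrite, which is the portability argument the LLM Mesh announcement makes.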

Dataiku’s new features powering the LLM Mesh will be released in public and private previews starting in October.

Clément Stenac, Chief Technology Officer and co-founder at Dataiku, shared, "The LLM Mesh represents a pivotal step in AI. At Dataiku, we're bridging the gap between the promise and reality of using Generative AI in the enterprise. We believe the LLM Mesh provides the structure and control many have sought, paving the way for safer, faster GenAI deployments that deliver real value."

Announcing the Dataiku LLM Mesh Launch Partners

Dataiku facilitates the effective and wide-ranging use of LLMs, vector databases, and various compute infrastructures in the enterprise, working to complement existing providers. This approach aligns with Dataiku's general philosophy of enhancing, rather than duplicating, the capabilities of existing technologies and making them accessible to everyone. Dataiku is pleased to announce its LLM Mesh Launch Partners, Snowflake, Pinecone, and AI21 Labs, who represent several of the key components of the LLM Mesh: containerized data and compute capabilities, vector databases, and LLM builders.

Torsten Grabs, Senior Director of Product Management at Snowflake, states, “We are excited about the vision of the LLM Mesh as we know the true value is not just getting LLM-powered applications to production — it’s about democratizing AI in a safe and secure manner. With Dataiku, we’re enabling our joint customers to deploy LLMs on their Snowflake data leveraging containerized compute from Snowpark Container Services within the security perimeter of their Snowflake accounts, all orchestrated by Dataiku to reduce friction and complexity and accelerate business value.”

Chuck Fontana, VP of Business Development at Pinecone, states, "LLM Mesh is more than an architecture—it's a pathway. Vector databases are new standards, powering AI applications through processes like Retrieval Augmented Generation. Together, Dataiku and Pinecone are setting a new standard, providing a way that others in the industry can align with, helping to overcome barriers the market faces in building enterprise-grade GenAI applications at scale. Pinecone looks forward to collaborating as an LLM Mesh Launch Partner."
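
For readers unfamiliar with Retrieval Augmented Generation, the sketch below illustrates the basic pattern the quote refers to: documents are embedded into vectors, the closest matches to a query are retrieved from a vector store, and the results are assembled into an LLM prompt. The toy embedder and in-memory store are stand-ins, not Pinecone's or Dataiku's actual APIs.

```python
# Illustrative RAG pattern with a toy embedder and in-memory vector store.
# A production system would use a real embedding model and a vector database.
import math
from typing import List, Tuple


def embed(text: str) -> List[float]:
    # Toy embedding: normalized letter-frequency vector.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def cosine(a: List[float], b: List[float]) -> float:
    return sum(x * y for x, y in zip(a, b))


class InMemoryVectorStore:
    def __init__(self) -> None:
        self.items: List[Tuple[List[float], str]] = []

    def upsert(self, text: str) -> None:
        self.items.append((embed(text), text))

    def query(self, question: str, top_k: int = 2) -> List[str]:
        q = embed(question)
        ranked = sorted(self.items, key=lambda it: cosine(q, it[0]), reverse=True)
        return [text for _, text in ranked[:top_k]]


store = InMemoryVectorStore()
for doc in ["Refunds are processed within 14 days.",
            "Support is available Monday through Friday.",
            "Enterprise plans include SSO and audit logs."]:
    store.upsert(doc)

question = "How long do refunds take?"
context = "\n".join(store.query(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # A real pipeline would send this prompt to an LLM via the mesh.
```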

Pankaj Dugar, SVP and GM, North America at AI21 Labs, states, "In today's evolving technological landscape, it's paramount that we foster a diverse, tightly integrated ecosystem within the Generative AI stack for the benefit of our customers. Our collaboration with Dataiku and the LLM Mesh underscores our commitment to this diversity, ensuring enterprises can access a broad spectrum of top-tier, flexible, and reliable LLM capabilities. We believe that diversity breeds innovation, and with Dataiku's LLM Mesh, we're stepping into a future of boundless AI possibilities."
