Redis announces major AI advancements and intention to acquire Decodable at Redis Released 2025

Redis outlines an expansive AI strategy featuring a significant acquisition and new services, propelling its growth as a key player in AI infrastructure.

Redis, the world's fastest data platform, unveiled a significant expansion of its AI strategy during the Redis Released 2025 event. The keynote address by CEO Rowan Trollope highlighted several key initiatives, including the acquisition of Decodable, the introduction of LangCache, and a series of advancements intended to cement Redis' position as a critical infrastructure layer for AI applications.

"As AI enters its next phase, the challenge isn't proving what language models can do; it's giving them the context and memory to act with relevance and reliability," Trollope noted. He emphasised how Redis' strategic acquisition of Decodable will streamline data pipeline developments, enabling data conversion into actionable context swiftly and efficiently within Redis.

Decodable, established by Eric Sammer, offers a serverless platform that simplifies the ingestion, transformation, and delivery of real-time data. By joining forces with Redis, Decodable aims to enhance AI capabilities and seamlessly connect developers with real-time data sources.

Redis also premiered LangCache, a fully managed semantic caching service that cuts latency and reduces token usage by up to 70% in LLM-powered applications. By serving semantically similar requests from cache rather than re-invoking the model, the service optimises performance and reduces costs significantly, supporting Redis’ mission to make AI agents more efficient.

The key advantages of LangCache include:

  • Up to 70% reduction in LLM API costs in high-traffic scenarios
  • 15x faster response times for cache hits compared to live LLM inference
  • Enhanced user experiences with lower latency and consistent outputs
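To make the idea concrete, here is a minimal sketch of how semantic caching works in principle, written against redis-py with a placeholder embed() function and a client-side similarity scan. It illustrates the technique rather than the LangCache API: a real deployment would rely on LangCache itself, or on Redis vector search, instead of scanning keys in the application.

    import numpy as np
    import redis

    r = redis.Redis(decode_responses=False)

    SIMILARITY_THRESHOLD = 0.9  # tune to the embedding model in use


    def embed(text: str) -> np.ndarray:
        # Placeholder: swap in a real embedding model; output is L2-normalised.
        rng = np.random.default_rng(abs(hash(text)) % (2**32))
        vec = rng.standard_normal(384).astype(np.float32)
        return vec / np.linalg.norm(vec)


    def call_llm(prompt: str) -> str:
        # Placeholder for a real LLM API call.
        return f"LLM answer to: {prompt}"


    def cache_lookup(prompt: str):
        # Return a cached response whose prompt embedding is close enough, else None.
        query_vec = embed(prompt)
        for key in r.scan_iter("semcache:*"):  # linear scan is fine for a demo only
            entry = r.hgetall(key)
            cached_vec = np.frombuffer(entry[b"embedding"], dtype=np.float32)
            if float(np.dot(query_vec, cached_vec)) >= SIMILARITY_THRESHOLD:
                return entry[b"response"].decode()
        return None


    def cache_store(prompt: str, response: str, ttl_seconds: int = 3600) -> None:
        # Cache the response keyed by a hash of the prompt, with an expiry.
        key = f"semcache:{abs(hash(prompt))}"
        r.hset(key, mapping={
            "prompt": prompt,
            "response": response,
            "embedding": embed(prompt).tobytes(),
        })
        r.expire(key, ttl_seconds)


    def answer(prompt: str) -> str:
        cached = cache_lookup(prompt)
        if cached is not None:
            return cached  # cache hit: no model call, no tokens spent
        response = call_llm(prompt)
        cache_store(prompt, response)
        return response

A cache hit costs a Redis lookup plus a similarity check instead of a full model invocation, which is where the latency and token savings described above come from.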

Redis continues to adapt to the rapid pace of AI development, with recent integrations making it easier for developers to use Redis alongside existing AI frameworks and tools. New integrations with AutoGen and Cognee, together with LangGraph enhancements, provide scalable memory for agents and chatbots.

Developers can now:

  • Utilise AutoGen with Redis as a fast memory layer
  • Leverage Cognee to manage memory through summarisation and reasoning
  • Implement LangGraph enhancements for reliability
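Those integrations expose Redis through each framework's own memory APIs; the sketch below shows the underlying pattern directly with redis-py, keeping a rolling per-session conversation history. The key names, turn limit, and expiry are illustrative assumptions, not part of any specific integration.

    import json
    import redis

    r = redis.Redis(decode_responses=True)


    def remember(session_id: str, role: str, content: str, max_turns: int = 50) -> None:
        # Append a message to the session's memory and trim it to the newest turns.
        key = f"agent:memory:{session_id}"
        r.rpush(key, json.dumps({"role": role, "content": content}))
        r.ltrim(key, -max_turns, -1)  # keep only the most recent turns
        r.expire(key, 86400)          # let idle sessions expire after a day


    def recall(session_id: str) -> list:
        # Load the session's message history for the next model call.
        key = f"agent:memory:{session_id}"
        return [json.loads(item) for item in r.lrange(key, 0, -1)]


    # Typical agent loop: recall context, call the model, remember the exchange.
    remember("session-42", "user", "Which caching strategy did we settle on?")
    history = recall("session-42")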

Additional Redis for AI Enhancements

Redis' evolution continues with key improvements to hybrid search and to embedding compression for AI applications. The introduced upgrades include:

  • Hybrid search improvements using Reciprocal Rank Fusion (RRF) to combine text and vector rankings (a minimal sketch of RRF follows this list)
  • Support for int8 quantised embeddings, yielding 75% memory savings and 30% faster search speeds
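Reciprocal Rank Fusion itself is a simple, well-established formula: each ranker contributes 1 / (k + rank) to a document's score, so documents that rank well in both the text list and the vector list rise to the top. The sketch below shows the general idea in Python with the conventional k = 60; it illustrates the technique, not Redis' internal implementation.

    def reciprocal_rank_fusion(rankings, k=60):
        # Fuse several ranked lists of document ids: score(d) = sum of 1 / (k + rank).
        scores = {}
        for ranking in rankings:
            for rank, doc_id in enumerate(ranking, start=1):
                scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
        return sorted(scores.items(), key=lambda item: item[1], reverse=True)


    # Fuse a full-text ranking with a vector-similarity ranking.
    text_hits = ["doc3", "doc1", "doc7"]
    vector_hits = ["doc1", "doc5", "doc3"]
    print(reciprocal_rank_fusion([text_hits, vector_hits]))
    # doc1 and doc3 come out on top because both rankers agree on them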

These latest updates ensure that Redis remains a pivotal platform for developing high-quality, reliable AI agents and frameworks.
