MapR awarded additional patent for Converged Data Platform

Company furthers innovation in optimal architecture for big data.

MapR Technologies has been granted a patent (US 9,501,483) from the United States Patent and Trademark Office. The awarded patent demonstrates the company’s technology advances in architecting a modern data platform that reliably runs mission-critical applications. This patent in particular covers key technology underpinning components of the MapR Converged Data Platform, including the multi-modal NoSQL database (MapR-DB) and the global streaming engine (MapR Streams).
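
MapR Streams is exposed through a Kafka-compatible API in which topics are addressed by a pathname inside a stream. The sketch below is a minimal illustration in Python, assuming MapR's Kafka-compatible Python client (its interface mirrors confluent-kafka); the module name, stream path and topic are illustrative assumptions rather than details taken from this announcement.

    # Sketch only: publish one message to a MapR Streams topic and read it back.
    # Assumes MapR's Kafka-compatible Python client (mirrors the confluent-kafka
    # interface); the module name, stream path and topic are illustrative.
    from mapr_streams_python import Producer, Consumer

    topic = "/apps/sensors:readings"   # MapR convention: /stream-path:topic-name

    producer = Producer({})            # no broker list: the stream path locates the data
    producer.produce(topic, value=b'{"device": "pump-7", "temp_c": 81.4}')
    producer.flush()

    consumer = Consumer({"group.id": "demo",
                         "default.topic.config": {"auto.offset.reset": "earliest"}})
    consumer.subscribe([topic])
    msg = consumer.poll(timeout=5.0)
    if msg is not None and not msg.error():
        print(msg.value())
    consumer.close()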

“Strengthening our growing IP portfolio, this patent reinforces our commitment to allowing our customers to uniquely run both operational and analytical processing on a single platform,” said Matt Mills, CEO, MapR Technologies. “Unlike Apache Hadoop or alternative big data technologies, the patented Converged Data Platform provides a unified and fast access layer to any type of data. We enable companies to take advantage of next generation applications, creating innovation and advancing their business through digital transformation.”

The patent claims cover file, table, and stream processing, protecting the following technology advances:

• Convergence – Fundamental integration of tables, files, and streams into a converged data platform

• Fast processing with low latency – Ability to open tables without having to replay a log

• High availability and strong consistency – Continuous access and fast recovery while ensuring strong consistency

• Security – Secure snapshots and mirrors of all kinds of persistent data, such as files, tables, and streams, which can help avoid data loss even in extreme cases such as ransomware (see the sketch after this list)
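
On the security point above, snapshots are typically taken per volume. The sketch below is an illustration only, assuming the MapR REST gateway on port 8443 mirrors the maprcli "volume snapshot create" command; the endpoint path, parameters, host, credentials, and names are assumptions to be checked against the MapR documentation.

    # Sketch only: request a point-in-time snapshot of a volume through the
    # MapR REST gateway. Endpoint, parameters, host, credentials and names are
    # assumptions; verify them against the MapR documentation before use.
    import requests

    resp = requests.get(
        "https://mapr-node.example.com:8443/rest/volume/snapshot/create",
        params={"volume": "apps.sensors", "snapshotname": "pre-upgrade"},
        auth=("mapr", "secret"),
        verify=False,          # lab setting; supply the cluster CA bundle in production
    )
    resp.raise_for_status()
    print(resp.json())         # the gateway reports success or failure as JSON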

Leveraging these inventions, the MapR Converged Data Platform delivers a core architecture for data-centric businesses across four key areas:

Enterprise-grade reliability in one platform – Vast scale with mission-critical reliability, disaster recovery, end-to-end security, and multi-tenancy let customers run next-generation big data applications in a business-critical, 24x7 environment that must never lose data.

Global, real-time, continuous data processing – Full read-write capabilities, low administration, automated optimisations, and immediate access to data all enable an end-to-end real-time environment that lets analysts continuously leverage data to gain critical business insights.

Continuous innovation – A patented core with standard APIs drives greater value from Apache Hadoop/Spark and other open source projects. Advanced technologies allow greater scale and performance, while compliance with community-driven, industry-standard APIs such as POSIX, NFS, LDAP, ODBC, REST, and Kerberos allows the key open source big data systems to work with existing systems.
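
Because the platform's file layer speaks standard POSIX and NFS, ordinary file I/O works against an NFS-mounted cluster with no special client. A minimal sketch in Python, assuming the conventional /mapr/<cluster-name> mount point; the cluster name and paths are illustrative.

    # Sketch only: plain POSIX file I/O against an NFS-mounted MapR cluster.
    # Assumes the conventional /mapr/<cluster-name> mount; names are illustrative.
    from pathlib import Path

    base = Path("/mapr/demo.cluster.com/apps/sensors")
    base.mkdir(parents=True, exist_ok=True)

    # Any POSIX-aware tool or library can read and write here directly;
    # no HDFS-specific client is required.
    (base / "readings.csv").write_text("device,temp_c\npump-7,81.4\n")
    print((base / "readings.csv").read_text())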

Foundation for converged analytics – A platform that enables multiple workloads in a single cluster lets customers run continuous analytics on both data at rest and data in motion without the delay of moving data to a task-specific cluster. A single cluster that handles converged workloads is also easier to manage and secure.
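
As an illustration of querying data at rest and data in motion from the same cluster, the sketch below uses PySpark. It assumes MapR's Spark distribution, where the Kafka source can address a MapR Streams topic by path without a broker list (stock Spark would also require kafka.bootstrap.servers); all paths and topic names are illustrative.

    # Sketch only: one Spark application querying data at rest (files) and data
    # in motion (a stream) on the same cluster, with no copy to a separate
    # analytics cluster. Paths and topic names are illustrative; the broker-less
    # Kafka source assumes MapR's Spark build.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("converged-analytics").getOrCreate()

    # Data at rest: files written through the POSIX/NFS or HDFS-style APIs.
    at_rest = spark.read.csv("/apps/sensors/readings.csv",
                             header=True, inferSchema=True)
    at_rest.groupBy("device").avg("temp_c").show()

    # Data in motion: the cluster's own stream, read through Spark's Kafka
    # source (MapR Streams is Kafka-API compatible).
    in_motion = (spark.readStream
                 .format("kafka")
                 .option("subscribe", "/apps/sensors:readings")
                 .load())
    query = in_motion.writeStream.format("console").start()
    query.awaitTermination(30)   # run briefly for the sketch, then stop
    query.stop()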
