Redis Labs challenges new cloud territory in latest independent research

Key strengths include multi-platform and active-active operation


2020 is a landmark year for database management systems (DBMS). The research and advisory firm Gartner has published its first dedicated Cloud DBMS Magic Quadrant, and open-source in-memory NoSQL database provider Redis Labs is evaluated in the report.

Gartner's publication of a Cloud DBMS market evaluation is logical given that in June 2019, it predicted the cloud was the future of the DBMS market. The rationale? Cloud accounted for 68 per cent of the database market's growth between 2017 and 2018 (excluding cloud-hosted DBMS licences). Moreover, most of the innovation in databases was happening in the cloud-native area, it said.

Redis Labs has already been acknowledged in previous Gartner DBMS research, and it also achieved Gartner's Peer Insights Customers' Choice distinction for operational DBMS (ODBMS) products in 2019. With the publication of the new Magic Quadrant report, Gartner positioned Redis Labs furthest to the right on the Completeness of Vision axis within the Challengers quadrant.

Several capabilities in Redis Enterprise Cloud as a product and a cloud-native service are noteworthy, including its multi-platform capability. The company offers its product as on-premises software, in the cloud, or as a hybrid deployment.

Redis Labs treats cloud-native operation as a strategic imperative. Like many other DBMS solutions, Redis Enterprise can be hosted in a cloud-based virtual machine, but using it natively in the cloud as a managed service brings several benefits. These include automatic provisioning for scaling on demand. Container-based operation enables developers to create small, distributed microservice-based applications that operators can scale horizontally and manage en masse using Kubernetes.

Multi-platform cloud support

Redis Labs integrates its products into the different layers of cloud operation and has been busy optimising Redis Enterprise Cloud for each cloud provider. In November, it announced a deep Azure integration with Microsoft. Covered here by The Register, Azure Cache for Redis can be used either as a standalone in-memory database or as an in-memory cache to enhance the performance of other databases that enterprises already use.

Azure Cache for Redis hooks the database more tightly into the Azure service set, with horizontal scaling and data persistence options. Enterprise customers can now set up Redis instances through the Azure Portal and CLI, and even control them programmatically via the Azure API.

Multi-platform operation is a key factor in Redis Labs' strategy, which also includes sponsoring the open-source Redis project. Redis Enterprise Cloud extends beyond Azure and AWS to Google Cloud, and Redis Labs has designed instances to talk to each other across these providers while also communicating with on-premises implementations in a hybrid or multi-cloud arrangement.

Redis Enterprise Cloud is available on AWS, and the company became an Advanced Technology Partner, the highest tier available, in the AWS Partner Network in September, before achieving AWS Outposts Ready designation in December. Companies earn these designations only by delivering consistent real-world successes with customers in production environments.

Redis Enterprise is also available as a fully managed service on Google Cloud. The Database-as-a-Service tightly integrates with Google Cloud to automate the tasks required to deploy, manage, and scale Redis.

Active-active operation

This multi-platform support complements another strength of Redis Enterprise Cloud: active-active operation. It enables Redis admins to create distributed high-availability architectures across wide geographic areas.

Redis Labs uses a structure called a conflict-free replicated database (CRDB), built from database instances created across different physical clusters. These clusters can be distributed across data centres around the world, supporting as many as 1,500 separate nodes.

This approach is a step beyond an active-passive architecture, which runs one node as the primary and another on standby. Active-passive's main use case is migrating data from on-premises to the cloud, or from one cloud to another; active-active takes it a step further. When the primary node goes down, it takes time to relocate resources to the passive node. Moreover, one of the nodes is always redundant, so you are architecturally limited to seeing only 50 per cent of your total available performance. Active-active, on the other hand, keeps all the nodes running at the same time, which enables in-memory Redis databases to fail over with local-latency performance.

Redis Labs set out to solve one of the biggest problems in database replication, which is reconciling changes made across different nodes. It's important that nodes processing different changes don't fall out of sync, because otherwise administrators can't rely on any single node to have the most up-to-date comprehensive view of the network.

To meet this challenge, the CRDB uses a mechanism that doesn't require a separate consensus protocol to keep nodes in sync. It uses a conflict-free replicated data type (CRDT) for peer replication.

Under the principles of CRDT, a Redis CRDB instance preparing to make a write breaks the process into two steps. In the first, it prepares the user's request, producing data that the CRDB then distributes to all instances exactly once using a first-in, first-out model that underpins a predictable write process. This happens independently of the applications that connect to Redis.

In a multi-site installation, all writes are sent to all active nodes. The system supports concurrent reads and writes, sending incoming requests to other sites immediately if something goes down. The company claims five-nines uptime figures using this mechanism, but also maintains sub-millisecond latency because writes don't require a lock on the database.
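The conflict-free idea can be illustrated with a grow-only counter, one of the simplest CRDTs. The following is a pure-Python sketch of the general CRDT principle, not Redis Labs' implementation: each replica increments only its own slot, and merging takes the per-slot maximum, an operation that is commutative, associative, and idempotent, so replicas converge in any order without a consensus round.

```python
# Toy grow-only counter (G-Counter) CRDT -- an illustration of the
# conflict-free merge idea behind CRDBs, not Redis Labs' actual code.
from collections import defaultdict

class GCounter:
    """Each replica increments only its own slot; merge takes per-slot max."""
    def __init__(self, replica_id):
        self.replica_id = replica_id
        self.slots = defaultdict(int)   # replica_id -> local count

    def incr(self, n=1):
        self.slots[self.replica_id] += n

    def merge(self, other):
        # Conflict-free: max() is commutative, associative, and idempotent,
        # so replicas converge regardless of merge order or repetition.
        for rid, count in other.slots.items():
            self.slots[rid] = max(self.slots[rid], count)

    def value(self):
        return sum(self.slots.values())

# Two geo-distributed replicas accept writes concurrently...
us, eu = GCounter("us-east"), GCounter("eu-west")
us.incr(3)
eu.incr(2)
# ...then exchange state and agree, without locking the database.
us.merge(eu)
eu.merge(us)
assert us.value() == eu.value() == 5
```

Richer data types need more bookkeeping than a counter, but the property being exploited is the same: merges that cannot conflict remove the need for a lock or a separate consensus protocol on the write path.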

Multi-model architecture

Redis Enterprise Cloud supports multiple data types and models. Traditional relational databases typically support basic tabular data models built by joining relational tables. Some support BLOB and geospatial data storage, but shoehorning these concepts into a relational model usually requires add-ons, introducing extra complexity and dependency issues where support is available at all.

Even SQL, the language used to access relational systems, doesn't include support for features like ordered data sets. This leaves software developers with a lot of heavy lifting to massage data into suitable forms for use on the application side.

As a NoSQL database, Redis is based on key-value stores, but over time native support for other data structures, such as lists, sorted sets, and streams, has been built into its engine. It has also expanded these data structures through Redis modules. These include JSON (the lingua franca for delivering data via cloud APIs), full-text search for search applications, graph for relationships, and time series for timestamped data.
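As a sketch of why these native structures save application-side work, here is a toy leaderboard with sorted-set-like semantics in pure Python; against a real Redis instance the equivalent operations would be the ZADD, ZINCRBY, and ZREVRANGE commands, with the engine keeping members ordered by score for you.

```python
# Minimal leaderboard mimicking Redis sorted-set semantics in pure
# Python -- illustrative only, not the Redis implementation.

class SortedSet:
    def __init__(self):
        self.scores = {}            # member -> score, like a sorted-set key

    def zadd(self, member, score):
        self.scores[member] = score

    def zincrby(self, member, delta):
        # Atomic score bump in real Redis; a plain update here.
        self.scores[member] = self.scores.get(member, 0) + delta
        return self.scores[member]

    def zrevrange(self, start, stop):
        # Highest score first, inclusive bounds like ZREVRANGE start stop.
        ranked = sorted(self.scores, key=self.scores.get, reverse=True)
        return ranked[start:stop + 1]

board = SortedSet()
board.zadd("alice", 120)
board.zadd("bob", 95)
board.zincrby("bob", 40)          # bob's score is now 135
top_two = board.zrevrange(0, 1)   # ["bob", "alice"]
```

Without such a structure in the engine, every ranking query forces the application to pull the raw data and sort it itself, which is exactly the heavy lifting the article describes.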

Redis Enterprise Cloud allows multi-model operations across the modules and the core Redis data structures to be executed in a fully programmable and distributed manner, while maintaining sub-millisecond latency. This multi-model operation lets developers work natively with data models in the engine and switch between them smoothly, offloading costly development effort from their own applications. It also enables them to use this NoSQL engine for a variety of application types, ranging from recommendation engines to financial analysis.

RedisAI

One of these modules focuses on a popular cloud computing trend: artificial intelligence. RedisAI integrates an inference engine inside the database layer, drastically reducing latency. Rather than handling model training, it focuses on the inference process, in which AI algorithms assess new data against trained statistical models.

Redis Labs designed RedisAI with performance in mind. As a native module integrated with the database, it runs closer to where the data is, reducing latency. It caches both the data and the model used to assess it, meaning that when you analyse raw data, you rarely have to go back to the separate repository serving the underlying model. The system can also use either GPUs or CPUs to handle inference, increasing performance still further. In tests, Redis Labs says it found throughput between four and nine times that of alternative serving approaches.
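The performance argument, keeping the model and the data it scores next to each other so repeated inference never has to reach back to an external model store, can be caricatured in a few lines. This is a toy memoisation sketch of the co-location idea only; it is not RedisAI's API, and all names in it are hypothetical.

```python
# Toy sketch of the co-location idea behind in-database inference:
# hold the trained model and recent scores next to the data, so a
# repeated request never touches a remote model repository.
# Hypothetical names throughout -- not RedisAI's actual interface.

def load_model_from_repository():
    # Stand-in for an expensive fetch of trained weights.
    return {"weight": 2.0, "bias": 1.0}

class InferenceCache:
    def __init__(self):
        self.model = load_model_from_repository()   # fetched once, kept hot
        self.results = {}                           # input -> cached score

    def infer(self, x):
        if x not in self.results:                   # compute only on a miss
            m = self.model
            self.results[x] = m["weight"] * x + m["bias"]
        return self.results[x]

engine = InferenceCache()
first = engine.infer(3.0)    # computed from the cached model: 7.0
second = engine.infer(3.0)   # served from the result cache, no recompute
```

In the real product the model lives inside the database process and executes on CPU or GPU; the sketch only shows why removing the round trip to a separate model store matters for latency.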

Redis Labs is building RedisAI in anticipation of a surge in AI-powered applications. The company envisages it being used in sectors that need to process data including images, video, and audio, among other things. Because of the multi-model interaction, you could take time series data and run AI-powered predictive analytics on it, for example.

Redis Labs sees Gartner's positioning of the company as a Challenger in cloud databases as a highlight of its immediate leadership potential. This resonates with other findings in the industry. In its 2020 annual developer survey, Stack Overflow highlighted the open-source database as the most-loved database among developers for the fourth year in a row. The numbers are stacking up nicely for the 11-year veteran database this year. What accolades will 2021 hold?

Sponsored by Redis Labs
