Redis Labs Delivers Powerful Data Platform for Next Wave of AI Applications

RedisAI improves inferencing performance and efficiency up to 10x by serving models within Redis, where the reference data lives

RedisGears is a serverless engine for infinite data-processing options at millisecond speed across Redis data structures, modules, and cluster nodes

Mountain View, May 19, 2020—Redis Labs, the home of Redis and provider of Redis Enterprise, today announced the general availability of RedisAI and RedisGears, both previewed last week at RedisConf 2020 Takeaway. Together, RedisAI and RedisGears transform Redis Enterprise into a low-latency, real-time data platform with infinite processing capabilities across any data model, serving artificial intelligence (AI) inference requests entirely within Redis.

According to Gartner[1], “Through 2024, the shift from pilot to production artificial intelligence (AI) will drive a 5x increase in streaming data analytics infrastructure.” The report further states, “Getting AI into production requires IT leaders to complement DataOps and ModelOps with infrastructures that enable end-users to embed trained models into streaming-data infrastructures to deliver continuous near-real-time predictions.”

“We’ve heard the challenges customers face as they move AI into production, in particular the end-to-end AI serving time, which in many cases is driven by the time it takes to collect, prepare, and feed the data to the AI serving engine. RedisAI and RedisGears were designed to solve this problem by reducing the end-to-end AI serving time to milliseconds,” said Yiftach Shoolman, CTO and co-founder of Redis Labs. “With Redis Enterprise as the perfect high-performance and scalable foundation, RedisAI and RedisGears will enable our customers to successfully use AI technologies to create operational efficiencies and solve real business problems in real time.”

RedisAI: Delivering predictions that help the business win

RedisAI (co-developed by Redis Labs and Tensorwerk) is the answer to the challenge every architect or developer faces when designing and implementing AI in production applications: the time spent outside the AI inference engine to collect and prepare the reference data. With the AI serving engine inside Redis, RedisAI reduces the time spent on these external processes and can deliver up to 10x more inferences than other AI serving platforms, at much lower latency. Many leading AI-driven applications, such as fraud detection, transaction scoring, ad serving, recommendation engines, image recognition, autonomous vehicles, and game monetization, will achieve dramatically better business outcomes with these performance improvements.

Integrated with MLflow, RedisAI eases management of the AI lifecycle by allowing running models to be updated seamlessly and without downtime. With built-in support for the major AI backends (TensorFlow, PyTorch, and ONNX Runtime), RedisAI allows inferencing to be run across platforms; for instance, models trained in PyTorch and TensorFlow can be served side by side by the same engine. Finally, in combination with Redis Enterprise’s high availability, linear scalability, and flexible deployment model, RedisAI can be deployed anywhere: cloud, on-premises data centers, edge, and even Raspberry Pi devices.
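For developers, the serving flow looks roughly like the sketch below, which stores a TorchScript model and runs it against reference data already held in Redis. It uses redis-py’s generic execute_command to issue RedisAI commands; the key names, model file, and exact command arguments are illustrative and may vary by RedisAI version.

```python
import redis

# Connect to a Redis instance with the RedisAI module loaded
# (host, port, and key names below are illustrative).
r = redis.Redis(host="localhost", port=6379)

# Store a TorchScript model blob under a Redis key.
with open("model.pt", "rb") as f:
    r.execute_command("AI.MODELSET", "mymodel", "TORCH", "CPU", "BLOB", f.read())

# Write the reference data as an input tensor (a 1x2 float tensor here).
r.execute_command("AI.TENSORSET", "in:1", "FLOAT", 1, 2, "VALUES", 3.0, 5.0)

# Run the model where the data lives, writing the prediction to another key.
r.execute_command("AI.MODELRUN", "mymodel", "INPUTS", "in:1", "OUTPUTS", "out:1")

# Read the prediction back.
print(r.execute_command("AI.TENSORGET", "out:1", "VALUES"))
```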

RedisGears: Customize Redis to build better applications

RedisGears is a new serverless engine for infinite programmability options in Redis. It enables transaction, batch, and event-driven operations to be processed in milliseconds by performing these functions close to the data, all within Redis. RedisGears also enables Redis to operate as a single data service by doing all the heavy lifting of managing updates to existing database and data warehouse systems.

Furthermore, RedisGears allows developers to perform any type of operation across Redis’ data structures and modules. RedisGears functions can be developed once and used across any type of Redis deployment: open source Redis, Redis Enterprise software, or Redis Enterprise Cloud.
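As a minimal sketch of what such a function can look like, the snippet below registers a Python Gears function that reacts to writes on keys matching a given prefix and mirrors each change into a Redis stream; the key pattern, stream name, and connection details are illustrative assumptions.

```python
import redis

# Connect to a Redis instance with the RedisGears module loaded
# (host, port, key pattern, and stream name below are illustrative).
r = redis.Redis(host="localhost", port=6379)

# A RedisGears function, written in Python, that runs inside Redis:
# for every write to a key matching 'user:*', append the key name to
# the 'user-changes' stream, close to where the data lives.
gears_function = """
GB('KeysReader').foreach(
    lambda x: execute('XADD', 'user-changes', '*', 'key', x['key'])
).register('user:*')
"""

# Ship the function to Redis; RedisGears distributes it across shards.
print(r.execute_command("RG.PYEXECUTE", gears_function))
```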

[1] Gartner, “Predicts 2020: Artificial Intelligence Core Technologies,” Chirag Dekate, Saniye Alaybeyi, Alan Priestley, Daniel Bowers, 11 December 2019.

Additional Resources

Read the Redis Labs blog to learn about the challenges companies face with AI inferencing and for more information on RedisAI and RedisGears.

Also watch the keynote and the RedisAI and RedisGears sessions from RedisConf 2020 Takeaway.


About Redis

Data is the lifeline of every business, and Redis helps organizations reimagine how fast they can process, analyze, make predictions, and take action on the data they generate. Redis provides a competitive edge to any business by delivering open source and enterprise-grade data platforms to power applications that drive real-time experiences at any scale. Developers rely on Redis to build performance, scalability, reliability, and security into their applications.

Born in the cloud-native era, Redis uniquely enables users to unify data across multi-cloud, hybrid and global applications to maximize business potential. Learn how Redis can give you this edge at redis.com.