Deliver instant predictive intelligence with the combination of Redis-ML and Spark ML
MOUNTAIN VIEW, Calif., October 4, 2016 at 4:58 a.m. PT
Today, Redis Labs, the home of Redis, introduced Redis-ML, an open source Redis Module for Machine Learning that, in combination with Spark Machine Learning (Spark ML), accelerates the delivery of real-time recommendations and predictions for interactive applications.
Machine learning is fast becoming a critical requirement for modern smart applications. Redis-ML accelerates the delivery of real-time predictive analytics for use cases such as fraud detection and risk evaluation in financial products, product or content recommendations for e-commerce applications, demand forecasting for manufacturing applications, or sentiment analysis of customer engagements. Spark ML (previously MLlib) delivers proven machine learning libraries for classification and regression tasks. Combined with Redis-ML, Spark ML lets applications deliver precise, reusable machine learning models faster and with lower execution latencies.
“The combination of Apache Spark and Redis simplifies and accelerates the implementation of predictive intelligence in modern applications,” said Ram Sriharsha, product manager for Apache Spark at Databricks. “This latest release from Redis Labs is a great example of Spark’s growth and maturity in enterprise machine learning applications.”
“The Redis-ML module, with Apache Spark, delivers lightning-fast classifications on larger data sizes, in real time and under heavy load, while allowing many applications developed in different languages to simultaneously utilize the same models,” said Dvir Volk, senior architect at Redis Labs. “The Redis-ML module is a great demonstration of the power of the Redis Modules API in supporting the cutting-edge needs of next-generation applications.”
Redis-ML enriches Spark ML in the following areas:
- Faster Prediction Generation: Storing and serving your trained Spark Machine Learning models directly from Redis parallelizes access to the models and significantly improves performance. Initial benchmarks showed a 5x to 10x latency improvement over the standard Spark solution for real-time classifications.
Redis-ML avoids the need to reconstruct the model from file systems or other disk-based data stores, a process that usually involves long serialization/deserialization overheads and slow disk accesses. With Redis-ML, the model is simply stored in its native format in Redis at the end of the training phase.
- Consistent Prediction Delivery: As user traffic grows, it is important to guarantee real-time recommendations and predictions at a consistent speed to the end user. With Redis-ML, recommendations and predictions are delivered at a consistent speed no matter how many concurrent users are accessing the model.
- Greater Interoperability: Redis-ML provides broad interoperability across languages, including Scala, Node, .NET, Python and more. With Redis-ML, your models are no longer restricted to the language they were developed in; they can be accessed concurrently by applications written in different languages through a simple API.
- Scaling Machine Learning Models: Delivering predictions with better precision requires larger machine learning models. Existing solutions cannot hold the model in memory once it grows beyond the memory available in a single node; they must then serialize it to disk and deserialize it on access, and performance suffers. The Redis-ML module takes full advantage of Redis Labs’ in-memory distributed architecture to scale the database to any size needed, in a fully automated manner and without affecting performance.
- Simplified Deployment: Once the models are ready, Redis-ML makes it easy for the application to obtain recommendations or predictions through simple APIs, without having to implement custom recommendation/prediction generation code or set up a highly available and scalable infrastructure to support it.
- Higher Availability: Training new models can be done offline. However, reliably delivering real-time predictive intelligence is critical for modern applications. The Redis-ML module, deployed with Redis Labs’ technology, delivers always-on availability that protects against process, node, rack or data center failures with instant automatic detection and failover.
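The features above come down to one pattern: the trained model lives in memory, addressed by a key, and each prediction is a lookup-and-traverse rather than a load-from-disk. The sketch below illustrates that pattern in pure Python for a Spark-ML-style decision tree; the dictionary layout, key name and `classify` helper are illustrative assumptions for this release, not Redis-ML’s actual storage format or API.

```python
# Illustrative sketch of in-memory model serving, the pattern Redis-ML
# applies server-side. The data layout and names here are hypothetical.

# Each internal node tests one feature against a threshold; leaves
# carry the predicted class (a fraud-detection-style toy model).
fraud_tree = {
    "feature": 0, "threshold": 1000.0,          # transaction amount
    "left":  {"label": "legit"},                # amount <= 1000
    "right": {                                  # amount > 1000
        "feature": 1, "threshold": 0.5,         # foreign-card flag
        "left":  {"label": "legit"},
        "right": {"label": "fraud"},
    },
}

# Models addressed by key, mimicking Redis keys such as "ml:fraud".
model_store = {"ml:fraud": fraud_tree}

def classify(key, features):
    """Walk the stored tree for one feature vector: no disk I/O and
    no deserialization -- the model is already a live structure."""
    node = model_store[key]
    while "label" not in node:
        branch = "left" if features[node["feature"]] <= node["threshold"] else "right"
        node = node[branch]
    return node["label"]

print(classify("ml:fraud", [1500.0, 1.0]))  # prints "fraud"
print(classify("ml:fraud", [200.0, 0.0]))   # prints "legit"
```

Because the traversal touches only in-memory structures, per-request latency stays flat as concurrent clients grow, which is the consistency and speed claim the bullets above make for Redis-ML.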
Join Redis Labs at Big Data London on Nov. 4, 2016 to hear more about the Redis Module for Machine Learning (Redis-ML) with Spark Machine Learning (Spark ML).
The open source Redis Module for Machine Learning (Redis-ML) is available at https://github.com/RedisLabs/spark-redis-ml.
Create additional modules to solve modern data challenges at the Redis Module Global Hackathon; registration is now open, with submissions concluding on Nov. 12. The event is expected to bring together over 500 teams from around the world, online and at the associated onsite hackathons in San Francisco and Tel Aviv. Participants will be eligible to win cash prizes totaling $10,000. Grand prize winners will be announced on Nov. 17.
For more information on Redis Labs, visit: redislabs.com.
About Redis Labs
Redis Labs is the open source home and provider of enterprise-class Redis, an in-memory NoSQL database platform benchmarked as the world’s fastest. Thousands of customers rely on Redis Labs’ high performance, seamless scalability, true high availability, versatility and best-in-class expertise to power their cutting-edge applications. Redis Labs’ software and database-as-a-service solutions enhance popular Redis use cases such as real-time analytics, fast high-volume transactions, in-app social functionality, and application job management, queuing and caching.