N5 Technologies
Last Updated:
Analyst Coverage: Philip Howard and Daniel Howard
Rumi, from N5 Technologies, Inc., addresses the need of modern enterprises to derive business insights in real time from massive volumes of historical and live data, while integrating this capability directly into their customer-serving transactional applications. This requirement is discussed in more detail in our earlier research on the subject, which was compiled before N5 came to market with its Rumi product.
N5 and its founders originated in the financial services sector, and Rumi was developed to power mission-critical, real-time risk management applications and ultra-low latency equity trading systems. The company is based in San Jose, California and, at the time of writing, is privately funded. Its main route to market is direct sales, though it plans to partner selectively with systems integrators and industry-vertical VARs. Product pricing is data-driven and subscription based. Rumi can be deployed in public and private clouds, on-premises data centres and edge data centres, and can be configured on virtualized or bare-metal infrastructure. The current offering supports Java-based applications; support for other programming languages is on the roadmap.
N5 Rumi
Last Updated: 11th February 2021
Mutable Award: Platinum 2021
Rumi is a software platform that enables enterprises to embed rich, real-time analytical data processing directly into their transactional applications. Specifically, it allows you to develop and run custom-developed analytics on and across large volumes and varieties of stored raw and/or pre-analysed data, together with live streaming data, in real time. The results can then be presented in a contextually relevant manner. Not only does it do this in real time with very low latency, it does so (according to N5) within predictable time limits. And finally, Rumi has been designed to provide the sort of resiliency that enterprises require to support mission-critical business processes.
To support these capabilities within a single environment (competitive vendors tend to require two, three or more products to do this), Rumi’s architecture is based on a multi-node, massively scalable and resilient distributed processing system. Each node functions as a fault-tolerant, highly available and elastically scalable Micro DataService. Each service houses data with co-located business logic and publishes integrated telemetry for monitoring and diagnostics. A service is independently capable of big data storage, fast data streaming, CRUD and/or analytical data serving. The stream processing and analytical data serving logic is implemented by the service’s business logic, which is activated by stream messages published by upstream nodes and by service requests issued by the service’s clients. The system is horizontally scaled by sharding Micro DataServices and by deploying multiple concurrently executing Micro DataServices, interconnected via fire-and-forget message passing provided natively by Rumi or over commodity messaging. The overall system of interconnected services is configured and managed as a single distributed deployment, and is called the Micro DataService Fabric (MDF), as illustrated in Figure 1.
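To picture what a single Micro DataService looks like to a developer, consider the following minimal sketch. It is purely illustrative, written in plain Java rather than against Rumi’s actual API, and the class and method names are invented for illustration. It shows the two activation paths described above: state and business logic live together in the service, which reacts to stream messages from upstream and serves client requests against that locally held data.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch only: a stand-in for a Micro DataService, not Rumi's API.
// Data and the business logic that operates on it are co-located in one service.
public class PositionService {

    // Co-located in-memory data: net position per instrument.
    private final Map<String, Double> positions = new ConcurrentHashMap<>();

    // Activation path 1: a stream message published by an upstream node.
    public void onTradeMessage(String instrument, double quantity) {
        positions.merge(instrument, quantity, Double::sum);
    }

    // Activation path 2: a service request issued by a client, served
    // directly from the locally held data (CRUD/analytical data serving).
    public double getPosition(String instrument) {
        return positions.getOrDefault(instrument, 0.0);
    }

    public static void main(String[] args) {
        PositionService service = new PositionService();
        service.onTradeMessage("ACME", 100);   // stream event from upstream
        service.onTradeMessage("ACME", -40);   // another stream event
        System.out.println(service.getPosition("ACME")); // client request: 60.0
    }
}
```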
The key innovation of Rumi is the MDF, on which the application and data are deployed together for execution; its high-level functionality is illustrated in Figure 2. The MDF combines in-memory distributed data storage, data streaming, real-time business logic execution and analytical data serving with a microservices-based application architecture. This enables highly concurrent and scalable big and fast data processing with predictable (and, N5 claims, ultra-low) latency. NoSQL-based data modelling is used to make it flexible and efficient enough to support hybrid application access patterns.
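To make the idea of hybrid access patterns concrete, the sketch below (again illustrative plain Java, not Rumi’s data API) models records NoSQL-style as schema-light keyed maps and shows the same store serving both a transactional point lookup and an analytical aggregation.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch only: one keyed, schema-light store serving both
// transactional point lookups and analytical aggregations.
public class OrderStore {

    // NoSQL-style modelling: records keyed by order id, values are loose maps.
    private final Map<String, Map<String, Object>> ordersById = new HashMap<>();
    private final List<Map<String, Object>> recentOrders = new ArrayList<>();

    public void put(String orderId, Map<String, Object> order) {
        ordersById.put(orderId, order);
        recentOrders.add(order);
    }

    // Transactional access pattern: fetch one record by key.
    public Map<String, Object> get(String orderId) {
        return ordersById.get(orderId);
    }

    // Analytical access pattern: aggregate across recent records in place.
    public double totalNotional() {
        return recentOrders.stream()
                .mapToDouble(o -> ((Number) o.get("notional")).doubleValue())
                .sum();
    }

    public static void main(String[] args) {
        OrderStore store = new OrderStore();
        store.put("o-1", Map.of("symbol", "ACME", "notional", 1000.0));
        store.put("o-2", Map.of("symbol", "ACME", "notional", 2500.0));
        System.out.println(store.get("o-1"));       // point lookup
        System.out.println(store.totalNotional());  // aggregation: 3500.0
    }
}
```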
Customer Quotes
“Personalized pricing implemented using N5 was instrumental in a 25x increase in loyalty-based revenue.”
“Using N5 resulted in over $50M in cost reduction. This included not just hardware, network and software savings, but also reduced support and trading risk related cost.”
Rumi has been designed to process massive volumes and varieties of data in real time and to scale such processing as data volumes grow. By co-residing application and data for execution, Rumi is designed to provide high-performing analytical data processing and serving. It also enables enterprises to rapidly develop and deploy applications that embed analytics within transactional processes. The MDF allows developers to focus on the application’s business logic, as complex as needed, whether transactional, analytic or a combination of the two, while the fabric manages all non-functional application details. Moreover, because application and data reside together, Rumi provides a single processing environment with a single code base. You can also integrate with AI/ML execution platforms if required.
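The practical effect of keeping transactional and analytical logic in one code base is easiest to see in a small example. The following hypothetical sketch (plain Java, not Rumi code, with invented names) accepts or flags a payment in the same call path that maintains the rolling statistics used to make that decision.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative sketch only: the transaction and the analytics that inform it
// live in the same service and the same code base.
public class PaymentHandler {

    private static final int WINDOW = 100;
    private final Deque<Double> recentAmounts = new ArrayDeque<>();

    // Transactional entry point: decide on the payment and update the
    // embedded analytics in the same call, against locally held state.
    public boolean process(double amount) {
        boolean flagged = amount > 5 * rollingAverage(); // analytic check
        if (!flagged) {
            recentAmounts.addLast(amount);               // transactional update
            if (recentAmounts.size() > WINDOW) {
                recentAmounts.removeFirst();
            }
        }
        return !flagged;
    }

    // Embedded analytic: rolling average over the recent window.
    private double rollingAverage() {
        if (recentAmounts.isEmpty()) {
            return Double.MAX_VALUE; // no history yet, so accept anything
        }
        return recentAmounts.stream()
                .mapToDouble(Double::doubleValue).average().orElse(0.0);
    }

    public static void main(String[] args) {
        PaymentHandler handler = new PaymentHandler();
        System.out.println(handler.process(20.0));  // true: accepted
        System.out.println(handler.process(25.0));  // true: accepted
        System.out.println(handler.process(500.0)); // false: flagged as anomalous
    }
}
```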
There is an increasing demand for hybrid application environments. Example use cases include hyper-personalisation of offers for e-commerce systems, dynamic pricing, ad bidding, real-time payment processing, credit card fraud detection and prevention, real-time curation of inference data for ML/AI models, processing of fast data streams for risk and order management in high-volume equity trading, real-time social media data curation based on user preferences, real-time personalisation in eGaming, and a range of industrial automation, IoT and 5G-enabled applications. The key point about Rumi is that you can support all of these from within a single development and deployment environment. Competitive offerings tend to require the use of multiple products, which potentially means significant extra costs and, in many cases, reduced business capability.
Further to this point, Rumi natively provides a highly available (with automatic hot switch-over to an alternate node), fault-tolerant (with site replication for disaster recovery) and massively scalable environment to support your hybrid applications, so it offers the sort of resiliency that enterprises will require. Moreover, it reduces application design and development complexity by relying on the platform to provide and manage enterprise-level capabilities. As a result, the development and deployment of applications should be easier and more agile than would otherwise be the case.
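Purely as a conceptual sketch of the behaviour being described (this is not how Rumi implements switch-over, and the names here are invented), automatic hot switch-over amounts to transparently routing a request to an alternate node when the primary is unavailable:

```java
import java.util.List;
import java.util.function.Function;

// Conceptual sketch only: retry a request against alternate nodes when the
// primary fails, so the caller never sees the outage.
public class FailoverRouter {

    public static <T> T callWithFailover(List<Function<String, T>> nodes, String request) {
        RuntimeException last = null;
        for (Function<String, T> node : nodes) {   // primary first, then alternates
            try {
                return node.apply(request);
            } catch (RuntimeException e) {
                last = e;                          // node unavailable: try the next one
            }
        }
        throw last != null ? last : new IllegalStateException("no nodes configured");
    }

    public static void main(String[] args) {
        Function<String, String> primary = req -> { throw new RuntimeException("primary down"); };
        Function<String, String> alternate = req -> "handled by alternate node: " + req;
        System.out.println(callWithFailover(List.of(primary, alternate), "getPosition(ACME)"));
    }
}
```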
Finally, Rumi’s architecture has been designed to support high throughput combined with (ultra-)low latency. This, in conjunction with the other benefits mentioned, should have a net positive effect on total cost of ownership.
The Bottom Line
Rumi offers the benefits of a new approach, designed and developed from the ground up specifically to support hybrid transactional/analytic application development and execution, concurrently scaling applications, compute, and data storage, streaming and serving capabilities, all within a single product and environment. That’s a very powerful combination and one that merits detailed consideration.
N5 Rumi (Streaming Analytics)
Last Updated: 14th December 2021
Mutable Award: One to Watch 2021
Rumi is a software platform that enables enterprises to embed rich, real-time analytical data processing directly into their transactional applications. Specifically, it allows you to develop and run custom-developed analytics on and across large volumes and varieties of stored raw and/or pre-analysed data, together with live streaming data, in real time. The results can then be presented in a contextually relevant manner. Not only does it do this in real time with very low latency, it does so within predictable time limits. And finally, Rumi has been designed to provide the sort of resiliency that enterprises require to support mission-critical business processes. In short, it addresses the need of modern enterprises to derive business insights in real time from massive volumes of historical and live data while integrating this capability directly into their customer-serving transactional applications. Further, Rumi can be deployed in public and private clouds, on-premises data centres and edge data centres, and can be configured on virtualized or bare-metal infrastructure. It currently supports Java applications, with other language support on the roadmap.
Customer Quotes
“1 million equity orders per second in under 10 microseconds client-market with zero loss.”
Leading NY hedge fund
“$50M in cost reduction in year-1. This included hardware, network and software savings, along with reduced support cost and trading risk related cost.”
Global Fortune 500 investment bank
“N5 provided high availability with performance that we could not get anywhere else.”
Global Fortune 500 investment bank
Rumi’s architecture is based on a multi-node, massively scalable and resilient distributed processing system. Each node functions as a fault-tolerant, highly available and elastically (and linearly) scalable Micro DataService (see Figure 1). Each service houses data with co-located business logic and publishes integrated telemetry for monitoring and diagnostics. A service is independently capable of big data storage, fast data streaming, CRUD and/or analytical data serving. In other words, it provides comprehensive, native stream processing capabilities within each and every node.
The stream processing and analytical data serving logic is implemented by the service’s business logic, which is activated by stream messages published by upstream nodes and by service requests issued by the service’s clients. The system is horizontally scaled by sharding Micro DataServices and by deploying multiple concurrently executing Micro DataServices, interconnected via fire-and-forget message passing provided natively by Rumi or over commodity messaging. The overall system of interconnected services is configured and managed as a single distributed deployment and is called the Micro DataService Fabric (MDF), as shown in Figure 2. Note also that such a system can be implemented incrementally using microservices.
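To give a flavour of this scaling model (a conceptual sketch only, not Rumi’s API, and with invented names), sharding means routing each key deterministically to one of several service instances, while fire-and-forget messaging means handing work downstream without waiting for a reply:

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Conceptual sketch only: key-based sharding across service instances plus a
// fire-and-forget hand-off to the chosen downstream service.
public class ShardedFabricSketch {

    interface DataService { void onMessage(String key, String payload); }

    private final List<DataService> shards;
    private final ExecutorService dispatcher = Executors.newSingleThreadExecutor();

    ShardedFabricSketch(List<DataService> shards) { this.shards = shards; }

    // Route each key deterministically to one shard; do not wait for a reply.
    void publish(String key, String payload) {
        DataService target = shards.get(Math.floorMod(key.hashCode(), shards.size()));
        dispatcher.submit(() -> target.onMessage(key, payload)); // fire-and-forget
    }

    void shutdown() throws InterruptedException {
        dispatcher.shutdown();
        dispatcher.awaitTermination(1, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        DataService shardA = (k, p) -> System.out.println("shard A handled " + k + ": " + p);
        DataService shardB = (k, p) -> System.out.println("shard B handled " + k + ": " + p);
        ShardedFabricSketch fabric = new ShardedFabricSketch(List.of(shardA, shardB));
        fabric.publish("ACME", "trade 100");
        fabric.publish("GLOBEX", "trade 250");
        fabric.shutdown(); // let the asynchronous hand-offs drain before exiting
    }
}
```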
The MDF is Rumi’s key innovation, and it is on the MDF that the application and data are deployed together for execution. It combines in-memory distributed data storage, data streaming, real-time business logic execution and analytical data serving with a microservices-based application architecture. This enables highly concurrent and scalable big and fast data processing with predictable (and ultra-low) latency performance. NoSQL-based data modelling is used to make it flexible and efficient enough to support hybrid application access patterns.
Rumi has been designed to process massive volumes and varieties of data in real time and to scale such processing as data volumes grow. Exactly-once processing is built in. Furthermore, by co-residing application and data for execution, Rumi provides high-performing analytical data processing and serving. It also enables enterprises to rapidly develop and deploy applications that embed analytics within transactional processes. The MDF allows developers to focus on the application’s business logic, as complex as needed, whether transactional, analytic or a combination of the two, while the fabric manages all non-functional application details. Moreover, because application and data reside together, Rumi provides a single processing environment with a single code base. You can also integrate with AI/ML execution platforms if required.
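Exactly-once processing can be pictured (again, purely as a conceptual sketch rather than a description of Rumi’s internal mechanism) as idempotent handling: each message carries an identifier, and a message whose identifier has already been applied is ignored rather than applied a second time.

```java
import java.util.HashSet;
import java.util.Set;

// Conceptual sketch only: the observable effect of exactly-once processing,
// modelled as deduplication by message id so a redelivery is not applied twice.
public class ExactlyOnceSketch {

    private final Set<String> appliedMessageIds = new HashSet<>();
    private long balance = 0;

    // Apply the update only if this message id has not been seen before.
    public void onDeposit(String messageId, long amount) {
        if (appliedMessageIds.add(messageId)) {
            balance += amount;
        } // duplicate delivery: already applied, so ignore it
    }

    public static void main(String[] args) {
        ExactlyOnceSketch account = new ExactlyOnceSketch();
        account.onDeposit("msg-1", 100);
        account.onDeposit("msg-1", 100);     // redelivered duplicate, ignored
        account.onDeposit("msg-2", 50);
        System.out.println(account.balance); // 150, not 250
    }
}
```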
There is an increasing demand for hybrid application environments. Example use cases include hyper-personalisation of offers for e-commerce systems, dynamic pricing, ad bidding, real-time payment processing, credit card fraud detection and prevention, real-time curation of inference data for ML/AI models, processing of fast data streams for risk and order management in high-volume equity trading, real-time social media data curation based on user preferences, real-time personalisation in eGaming, and a range of industrial automation, IoT and 5G-enabled applications. Rumi enables you to support all of these from within a single development and deployment environment. Competitive offerings tend to require the use of multiple products, which potentially means significant extra costs and, in many cases, reduced business capability and agility.
Further to this point, Rumi natively provides a highly available, fault-tolerant and massively scalable environment to support your hybrid applications, so it offers the sort of resiliency that enterprises will require. Moreover, it reduces application design and development complexity by relying on the platform to provide and manage enterprise-level capabilities. As a result, the development and deployment of applications should be easier and more agile than would otherwise be the case.
Finally, Rumi’s architecture has been designed to support high throughput performance combined with (ultra-)low latency. It is more than capable of processing large quantities of streaming data at very high speeds, in large part because it is able to distribute both data and compute responsibilities intelligently across its architecture. This, in conjunction with the other benefits mentioned, should have a net positive effect on total cost of ownership.
The Bottom Line
Rumi offers the benefits of a new approach, designed and developed from the ground up specifically to support hybrid transactional/analytic application development and execution within a single product/environment. The results certainly seem potent, both for the streaming space and in general.