Collecting, storing and visualizing raw machine data is just a prologue to any serious IoT project. The economic value comes from management decisions that optimize the efficiency of existing processes and enable new end-customer services.
These ambitions are technically enabled by aggregation, processing and deep analysis of device data and event streams. The shape of these operations is unique in every large solution: optimizing the storage of harvested crops is totally different from orchestrating transactions in a smart grid or managing a fleet of forklifts.
The analytical capabilities of AggreGate span from simple alerting to advanced trainable units that leverage machine learning for prediction and anomaly detection. Years on the market and thousands of projects delivered by our partners across vertical markets have proven that the platform offers a comprehensive toolset for every use case.
Machine Data Cooking
AggreGate acquires, parses, ingests, sorts, filters, transforms, aggregates, batches, streams, and analyzes data coming from things small and big, regardless of whether they are health sensors or SDH network monitoring appliances. From there it depends: some data is forwarded to applications that better understand its value, while the rest is managed by the platform operators.
We are used to designing our analytical modules without any assumptions about the physical meaning of the data they might handle in the future. The platform's device drivers abstract the values from the peculiarities of communication protocols. At the next step, the data is injected into the unified model, where it is scaled and further transformed to ensure consistent measurement units. From that point, it flows freely between all modules of the platform.
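The scaling step can be illustrated with a minimal sketch of unit normalization. The variable names, unit names and conversion table below are hypothetical; the platform's actual unified model is not shown here.

```python
# Sketch of unit normalization: raw driver readings are converted into a
# canonical unit before entering the unified data model. Illustrative only.

CANONICAL = {
    # variable -> (canonical unit, converter from any supported unit)
    "temperature": ("celsius", lambda v, unit: {
        "celsius": v,
        "fahrenheit": (v - 32) * 5 / 9,
        "kelvin": v - 273.15,
    }[unit]),
}

def normalize(variable: str, value: float, unit: str) -> float:
    """Convert a raw reading into the model's canonical unit."""
    _, convert = CANONICAL[variable]
    return convert(value, unit)

print(normalize("temperature", 77.0, "fahrenheit"))  # 25.0
```

Once every reading is expressed in canonical units, downstream modules can compare and aggregate values without knowing which protocol or sensor produced them.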
Big Data is Really Big
Data streams in IoT can be overwhelming. A hundred thousand transactions per second may be the peak load for a large financial institution, but it is just a typical measurement flow from a single oil refinery. We have spent years optimizing our framework to handle such streams on a single server, and to do much more with clusters connected via AggreGate's distributed architecture.
AggreGate features expression, query, flow and other languages designed for natural handling of the normalized data coming from hardware devices and circulating inside the system. The integrated debugging and runtime environment greatly simplifies the advanced data processing required in present-day monitoring and control systems.
Discovering patterns in structured and semi-structured data is a self-service feature of the platform. More than twenty algorithms and hundreds of parameters satisfy the typical expectations of data scientists. Visual workflows allow you to configure, train, score and operate the models according to your custom logic. Incremental training and stream processing complete the picture.
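Incremental training means the model is updated sample by sample as data arrives, rather than refitted on the full history. The sketch below uses a hand-rolled online linear model as a stand-in for the platform's trainable units; all class and parameter names are illustrative.

```python
# Minimal sketch of incremental (online) training over a data stream:
# each sample triggers one SGD step, so no historical data is retained.

class OnlineLinearModel:
    def __init__(self, n_features: int, lr: float = 0.01):
        self.w = [0.0] * n_features  # weights
        self.b = 0.0                 # bias
        self.lr = lr                 # learning rate

    def predict(self, x):
        return sum(wi * xi for wi, xi in zip(self.w, x)) + self.b

    def partial_fit(self, x, y):
        """One stochastic-gradient step on a single (x, y) sample."""
        err = self.predict(x) - y
        self.w = [wi - self.lr * err * xi for wi, xi in zip(self.w, x)]
        self.b -= self.lr * err

model = OnlineLinearModel(n_features=1, lr=0.1)
for _ in range(200):                 # simulated stream for y = 2x + 1
    for x in (0.0, 1.0, 2.0):
        model.partial_fit([x], 2 * x + 1)
print(round(model.predict([3.0]), 1))  # converges toward 7.0
```

The same pattern scales to scoring: the partially trained model can serve predictions at any point in the stream, which is what makes incremental training compatible with continuous operation.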
Time Series Analytics
AggreGate is especially good at analyzing time series, the most common data type in IoT. It can detect anomalies, predict trend evolution and classify time series regardless of their physical meaning or the number of points in a dataset. Streaming support ensures smooth processing of large datasets that will never fit in memory.
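Streaming anomaly detection can work in constant memory by maintaining running statistics instead of buffering the series. A minimal sketch using Welford's online mean/variance and a z-score threshold (all names and thresholds are illustrative, not the platform's algorithms):

```python
# Constant-memory anomaly detection on a time-series stream: only the
# running count, mean and squared-deviation sum are kept, never the data.
import math

class StreamingAnomalyDetector:
    def __init__(self, z_threshold: float = 3.0):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.z_threshold = z_threshold

    def update(self, x: float) -> bool:
        """Return True if x is anomalous relative to the data seen so far."""
        anomalous = False
        if self.n > 1:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) / std > self.z_threshold:
                anomalous = True
        # Welford update: O(1) memory regardless of stream length
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

det = StreamingAnomalyDetector()
readings = [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.1, 55.0]
flags = [det.update(r) for r in readings]
print(flags[-1])  # the 55.0 spike is flagged as anomalous
```

Because each update is O(1) in time and memory, the same detector handles a dataset of any length, which is exactly the property needed for series that never fit in memory.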
Event stream monitoring and processing capabilities include filtering, sorting, aggregation, deduplication, masking, correlation, acknowledgment, enrichment, and root cause analysis. Most analytical tools support event-driven behavior that can be triggered by external events or by events generated by user-defined object and process models.
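Of these operations, deduplication is the easiest to sketch: repeated events from the same source within a time window are suppressed so that operators see one alert instead of a storm. The event fields (source, code, timestamp) below are hypothetical, not a platform schema.

```python
# Sketch of event-stream deduplication within a sliding time window.

def deduplicate(events, window_seconds=60):
    """Yield events, dropping repeats of the same (source, code)
    pair seen within the last window_seconds."""
    recent = {}                      # (source, code) -> last kept timestamp
    for ev in events:
        key = (ev["source"], ev["code"])
        last = recent.get(key)
        if last is None or ev["timestamp"] - last > window_seconds:
            recent[key] = ev["timestamp"]
            yield ev                 # first occurrence in this window

stream = [
    {"source": "pump-1", "code": "OVERHEAT", "timestamp": 0},
    {"source": "pump-1", "code": "OVERHEAT", "timestamp": 30},   # suppressed
    {"source": "pump-1", "code": "OVERHEAT", "timestamp": 120},  # new window
]
kept = list(deduplicate(stream))
print(len(kept))  # 2
```

Correlation and enrichment follow the same streaming shape: a small keyed state table consulted per event, which keeps latency low even under heavy event rates.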
An IoT platform is not widely expected to offer much more than gauges, maps, tables, and time series charts. AggreGate, however, matches best-in-class business intelligence systems in building feature-rich statistical, analytical and data mining interfaces. You can slice and dice your data, discovering new patterns and finding ways to save the long-rumored percent of power consumption or hours of machine downtime.
Focus on Outcome
Although the platform doesn't bring anything specific to your business on its own, it allows you to concentrate on delivering value instead of solving infrastructure tasks. This drives a tangible ROI from the first weeks of your project and brings your service to market within a quarter.
The object and process modeling engine allows creating digital twins of physical assets and services. Models employ business rules for making automatic control decisions upon important events. Every model can be bound to devices, data sources or other models standing lower in the digital enterprise hierarchy.
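The twin-plus-rules idea can be sketched as a small state object bound to a device's readings, with a business rule evaluated on each update. Every name here (the class, the `overheat_limit` parameter, the "shutdown" decision) is hypothetical, standing in for the platform's modeling engine.

```python
# Illustrative digital-twin sketch: the twin mirrors device state and
# applies a business rule to each incoming reading.

class DigitalTwin:
    def __init__(self, name: str, overheat_limit: float = 90.0):
        self.name = name
        self.overheat_limit = overheat_limit
        self.state = {}              # mirrored device variables
        self.alerts = []             # events raised by business rules

    def on_reading(self, variable: str, value: float) -> str:
        """Update twin state and evaluate business rules."""
        self.state[variable] = value
        if variable == "temperature" and value > self.overheat_limit:
            self.alerts.append(f"{self.name}: overheat at {value} C")
            return "shutdown"        # automatic control decision
        return "ok"

twin = DigitalTwin("compressor-7")
print(twin.on_reading("temperature", 72.0))  # ok
print(twin.on_reading("temperature", 95.5))  # shutdown
```

Binding twins to other twins lower in the hierarchy would follow the same pattern: a parent model subscribes to the decisions its children emit, building up the digital enterprise layer by layer.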