
MI-STREAM - Data Processing & Storage; Device & Information Management
Cutting-edge stream processing of real-time data feeds from thousands of data sources.

 

WHAT IS MI-STREAM

 

The Affectli Mi-Stream module consumes various data streams, processes them and produces events that can then be sent to Affectli, where they can optionally trigger business processes, cases or scripted logic.
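As a minimal sketch of this consume-process-emit pattern, assuming a toy threshold rule and invented names (`process_stream`, the event fields) that are not part of the actual Mi-Stream API:

```python
# Illustrative sketch of the pipeline described above: consume a data feed,
# apply processing, and emit events that downstream logic could act on.
# The threshold rule and event shape are assumptions for this example only.

def process_stream(readings, threshold=75.0):
    """Turn raw (device, value) readings into events for the platform."""
    events = []
    for device_id, value in readings:
        if value > threshold:  # trivial rule: flag out-of-range values
            events.append({"device": device_id,
                           "type": "THRESHOLD_EXCEEDED",
                           "value": value})
    return events

feed = [("pump-1", 40.2), ("pump-2", 81.5), ("pump-3", 90.1)]
print(process_stream(feed))  # two events: pump-2 and pump-3
```

In the real module, events like these would be forwarded to Affectli, where they could optionally trigger a business process, case or script.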

 

The basic implementation consists of the following services, deployed, managed and scaled by Kubernetes:

 

1. Big Data Management service
2. Data Warehouse
3. Cluster-computing framework
4. Distributed file system (Apache Hadoop)
5. Dataflow Orchestration Service
6. Search service
7. Stream Processing Service

 Company Affectli Infographic Mi stream 2

 

The diagram above shows how we make data available to any BI tool: Data Analysts can prepare denormalized datasets from normalized data and load them into a data storage system of their choice.

 

The Affectli Mi-Stream service consists of two parts:

 

The Affectli Mi-Stream connector is a component used in cases where the devices that generate data require a secure communication channel. It collects data/signals from devices available on the network, then encrypts and forwards the data to the Affectli Mi-Stream Processing cluster. The connector can collect data using a vast number (over 200) of protocols; protocols that are not supported out-of-the-box can be added as custom implementations.
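The collect-encrypt-forward loop can be sketched as follows. All names are hypothetical, and the XOR "cipher" is only a stand-in for the real encrypted channel (e.g. TLS) a production connector would use:

```python
# Toy sketch of a connector: poll devices, encrypt each payload, forward it.
# The XOR step below is a PLACEHOLDER for real encryption, used only so the
# example stays self-contained; do not use XOR as a cipher in practice.

def collect(devices):
    """Poll each device for its latest signal (here: canned readings)."""
    return [(name, read()) for name, read in devices]

def encrypt(payload: bytes, key: int = 0x5A) -> bytes:
    """Placeholder cipher; applying it twice restores the original bytes."""
    return bytes(b ^ key for b in payload)

def forward(cluster, record):
    cluster.append(record)  # stands in for a network send to the cluster

cluster = []  # stands in for the Mi-Stream Processing cluster endpoint
devices = [("sensor-1", lambda: "21.5"), ("sensor-2", lambda: "19.8")]
for name, value in collect(devices):
    forward(cluster, (name, encrypt(value.encode())))

print(len(cluster))  # 2 records forwarded
```

In a real deployment the protocol-specific collection (one of the 200+ supported protocols, or a custom implementation) would replace the canned `lambda` readers.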

 

The Affectli Mi-Stream Processing Cluster normalizes and transforms incoming data/signals so that they can be further processed through a rules engine. Depending on customer requirements, the rules engine can consist of a simple rule set or use advanced machine learning and AI algorithms to generate events that are then consumed by the Affectli platform.
The Affectli Mi-Stream connector and Processing Cluster can easily be configured with minimal to zero programming required.
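A simple-rule-set version of the normalize-then-evaluate flow might look like the sketch below. The field names, unit conversion and rules are invented for illustration; the real cluster is configured rather than coded, and can also run ML/AI models in place of the rule list:

```python
# Sketch of the Processing Cluster's two stages: normalize vendor-specific
# signals into a common shape, then run a (here: trivial) rule set over them.

def normalize(raw):
    """Map assorted source fields onto one canonical signal shape."""
    device = raw.get("dev") or raw.get("device_id")
    if "temp_f" in raw:                      # convert Fahrenheit sources
        temp_c = (raw["temp_f"] - 32) * 5 / 9
    else:
        temp_c = raw["temp_c"]
    return {"device": device, "temp_c": temp_c}

RULES = [  # (predicate, event type) pairs; a real rule set is configured
    (lambda s: s["temp_c"] > 80, "OVERHEAT"),
    (lambda s: s["temp_c"] < 0, "FROST"),
]

def run_rules(signal):
    """Emit one event per rule that matches the normalized signal."""
    return [{"device": signal["device"], "event": name}
            for check, name in RULES if check(signal)]

signal = normalize({"dev": "oven-3", "temp_f": 212.0})
print(run_rules(signal))  # [{'device': 'oven-3', 'event': 'OVERHEAT'}]
```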

 

HOW DOES IT FIT INTO AFFECTLI

 

Affectli uses Mi-C3’s Mi-Stream application, which can collect protocol data units from various protocols running on the corresponding interfaces of multi-technology, multi-vendor networks using passive probes. It can also collect data from different planes of a remote or network management system.

 

WHO USES MI-STREAM

 

On deployment of Affectli, Mi-Stream serves as the integration module of the platform. Business decisions are programmed into Mi-Stream during the configuration process. Mi-Stream is a cutting-edge stream processing engine that collects and processes all incoming data feeds, and it gives you unique insight into what is happening by correlating the data from your sources. Business processes are proactively run through the stream to catch problems before they happen and to react to issues as they arise.

The main view for Mi-Stream is the network overview, which shows a map of all your sources, indicating whether each source is active, connected or disconnected, down, or reporting faults. The network overview is built by adding sources (as explained further on) and allows users to quickly move and manage their sources by drag and drop.

 

WHY IS MI-STREAM AN INTEGRAL PART OF THE PLATFORM

 

Affectli is entirely network-, device- and vendor-independent. It imposes no specific requirements on the nature or source of the signals or systems it integrates with, and is entirely agnostic about source specifications: any source that can push data to Affectli’s API is seamlessly folded into the information flowing into the Platform.

The Affectli Platform allows any third-party suppliers engaging with the business to integrate fully and seamlessly within the Client’s space. Using the Affectli API, suppliers can plug into their customers’ Affectli space, giving the Client a single, seamless view of their operations no matter where the information comes from.

Affectli not only integrates easily with systems that feed into it, but also contributes to other organisation-wide systems that exist at the time of implementation or are introduced during the lifetime of the Affectli use license. Affectli has been particularly successfully integrated with ERP systems (such as Microsoft Dynamics AX or SAP), specialist BI platforms (including Oracle and Pentaho) and third-party databases (such as MySQL and PostgreSQL). Affectli is designed to integrate with any other system that allows it to do so, including business support systems and customer relationship management systems.

 

Company Affectli Infographic Mi stream 3

 

WHEN DOES THE MI-STREAM BENEFIT BUSINESS

 

The platform manages the data, control and configuration of your devices and network, automatically reports on faults, dispatches and manages a technical team through to satisfactory completion, and reports to essential staff and customers throughout.
Affectli Nifi is a feature-rich data lake platform built on Apache Hadoop and Apache Spark. Affectli Nifi provides a business-friendly data lake solution and enables self-service data ingestion, data wrangling, data profiling, data validation, data cleansing/standardization, and data discovery. Its intuitive user interface allows IT professionals to access the data lake without having to code interfaces themselves.

 

HOW DOES IT WORK FROM A USER PERSPECTIVE

 

All 3rd-party inputs are received by Mi-C3’s Affectli Mi-Stream server; based on defined business rules, only certain record types are passed on to the Affectli server for further processing.
The BI reporting facility of Affectli can draw information from both Affectli and the data stored within the Affectli Mi-Stream server.
Affectli Mi-Stream is one of the few platforms that allow your organisation to read live streaming signals, data batches, messages, alarms and status data directly from equipment in the field to trigger and start high-level business processes.
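The rule-based pass-through described above can be sketched as a filter over record types. The type names here are invented for illustration; in practice the allowed set comes from the configured business rules:

```python
# Sketch of business-rule filtering: only configured record types are
# forwarded from the Mi-Stream server to the Affectli server.
# "ALARM" / "STATUS_CHANGE" / "HEARTBEAT" are hypothetical type names.

ALLOWED_TYPES = {"ALARM", "STATUS_CHANGE"}  # stands in for business rules

def filter_records(records):
    """Keep only the record types the rules allow through."""
    return [r for r in records if r["type"] in ALLOWED_TYPES]

incoming = [{"type": "ALARM", "id": 1},
            {"type": "HEARTBEAT", "id": 2},
            {"type": "STATUS_CHANGE", "id": 3}]
print([r["id"] for r in filter_records(incoming)])  # [1, 3]
```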


The features of this stack are the ingestion, preparation, discovery, monitoring and design of data.


Affectli contains a high-level graphical interface that gives a user direct access to the full features and complex structures of Nifi without having to code solutions.


Customers can convert signals from a variety of formats into JSON that connects directly to the Affectli Mi-Stream business flow engine. Affectli Nifi is built to deal with data flow, SOA, APIs, IoT devices and low-level device data in a seamless manner.
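A format conversion of this kind can be sketched as below. The semicolon-delimited input layout and the field names are assumptions for the example, not the actual Mi-Stream wire format:

```python
# Sketch: convert a delimited device signal into the kind of JSON document
# that could feed a flow engine. Layout and field names are invented.
import json

def signal_to_json(line: str) -> str:
    """Parse 'device;metric;value' and emit a JSON signal document."""
    device, metric, value = line.strip().split(";")
    return json.dumps({"device": device, "metric": metric,
                       "value": float(value)})

print(signal_to_json("rtu-7;voltage;231.4"))
# {"device": "rtu-7", "metric": "voltage", "value": 231.4}
```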


Affectli Nifi ingests data and monitors the status of communications links (feeds), networks and data flow across a wide range of sensors and systems.


The screenshot below shows events that have been collected by Mi-Stream; alerts are monitored as they are received. The polling interval for these alerts can be set to anything from every 30 seconds to any other time interval relevant to the client.

mi stream01

Incoming signals and alarms are linked to processes and tasks. A user can mark an issue/signal as “seen and done” by clicking on the tick, or delete the alarm by clicking on the cross. A process can also be initiated directly by clicking on the process icon. This is particularly efficient: the process could mean that a technician is requested at a certain site, or that a certain part needs to be ordered. Because the fault information is already in the alarm, the process to get a technician on site starts instantly. The user can also see what tasks are in progress (in the A-Box) by clicking the A-Box icon.

In the screenshot below, a process was started by clicking on the START PROCESS icon. This immediately starts the process for the event.

mi stream02

Affectli supports north-bound integration with open interfaces. Mi-Stream ingests directly from sensors, gateways, queues, RDBMS, Hadoop, SFTP servers and social media feeds, to name a few. It can ingest both streaming data (real-time ingestion) and batch data (ingestion of bulk data).
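The contrast between the two ingestion modes can be sketched as follows, with a toy accumulator standing in for whatever downstream handler a real deployment would configure:

```python
# Sketch contrasting the two ingestion modes mentioned above: streaming
# (record-at-a-time, as data arrives) versus batch (one bulk hand-off).
# The handler is a toy accumulator; all names are invented for illustration.

def ingest_streaming(source, handle):
    for record in source:      # records arrive one at a time, in real time
        handle([record])

def ingest_batch(source, handle):
    handle(list(source))       # the whole bulk data set is handled at once

store = []
handle = store.extend          # downstream handler: just collect records
ingest_streaming(iter([1, 2]), handle)
ingest_batch([3, 4, 5], handle)
print(store)  # [1, 2, 3, 4, 5]
```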