An advanced Data Architecture for Manufacturing 4.0

Leveraging A.I., IoT and Stream Processing to enable the Smart Manufacturing paradigm

On June 3rd we hosted our webinar “An advanced Data Architecture for Manufacturing 4.0” in collaboration with our Partner Radicalbit.

During the event, attendees learned about the potential of combining A.I., IIoT and Stream Processing: a blend of cutting-edge technologies that enables the Smart Manufacturing paradigm, allowing companies to forecast production behavior at run time, instantly predict economic and timing impacts, analyze whether equipment is functioning correctly and, last but not least, track predictive maintenance status in (near) real time.

More specifically, attendees discovered how to transform data into information as quickly as possible and interpret it correctly thanks to RNA, an enterprise-grade, end-to-end DataOps & MLOps platform designed to combine streaming event analysis with A.I., simplifying and accelerating the development of Advanced Analytics projects and Machine Learning-enabled Decision Support Systems.

By adopting these cutting-edge technologies, companies in the Manufacturing industry can become more efficient, less resource-consuming, service-oriented (instead of merely product-oriented) and quickly adaptable to business needs, and can act autonomously while still leveraging human interaction.

Thanks to Roberto Mariotti (Technical Presales Manager at Radicalbit), Davide Fiacconi (Data Scientist at Radicalbit) and Cosma Rizzi (International Business Development at Bitrock) for sharing their expertise in the field and giving useful insights into the main challenges and trending topics of Manufacturing 4.0.

If you didn’t have the chance to attend the webinar live, or if you want to go back through the slides that were shown, you can access the presentation at the following link:

If you want to access the webinar recording, or simply want to know more about RNA and Bitrock's technology offering, send an email to

Davide Fiacconi, Data Scientist

Roberto Mariotti, Technical Presales Manager

Cosma Rizzi, International Business Development Manager

Turning Data at REST into Data in Motion with Kafka Streams


From Confluent Blog

Another great achievement for our Team: we are now featured on the official Confluent blog with one of our R&D projects based on Event Stream Processing.

Event stream processing continues to grow among business cases that have relied primarily on batch data processing. In recent years, it has proven especially valuable when the decision-making process must take place within milliseconds (for example, in cybersecurity and artificial intelligence), when business value is generated by computations on event-based data sources (for example, in Industry 4.0 and home automation applications), and, last but not least, when the transformation, aggregation or transfer of data residing in heterogeneous sources involves serious limitations (for example, in legacy systems and supply chain integration).

Our R&D team decided to start an internal PoC based on Kafka Streams and Confluent Platform (primarily Confluent Schema Registry and Kafka Connect) to demonstrate the effectiveness of these components in four specific areas:

1. Data refinement: filtering the raw data in order to serve it to targeted consumers, scaling the applications through I/O savings

2. System resiliency: using the Apache Kafka® ecosystem, including monitoring and streaming libraries, in order to deliver a resilient system

3. Data update: getting the most up-to-date data from sources using Kafka

4. Machine resource optimization: decoupling data processing pipelines and exploiting parallel data processing and non-blocking I/O in order to maximize hardware capacity

These four areas can impact data ingestion and system efficiency by improving system performance and limiting operational risks as much as possible, which increases profit-margin opportunities by providing more flexible and resilient systems.
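As a rough illustration of the first area (data refinement), the sketch below uses the Kafka Streams DSL to filter a raw topic down to the records downstream consumers actually need, and exercises the topology with TopologyTestDriver (from kafka-streams-test-utils) so no broker is required. The topic names and the filtering predicate are our own assumptions for illustration, not details taken from the project.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.TopologyTestDriver;
import org.apache.kafka.streams.kstream.KStream;

public class DataRefinementSketch {

    // Builds a topology that filters raw events before forwarding them,
    // saving I/O for targeted consumers (area 1: data refinement).
    // Topic names here are hypothetical.
    static Topology buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> raw = builder.stream("raw-events");
        raw.filter((key, value) -> value != null && !value.isEmpty())
           .to("refined-events");
        return builder.build();
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "refinement-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // TopologyTestDriver runs the topology in-process, without a broker.
        try (TopologyTestDriver driver = new TopologyTestDriver(buildTopology(), props)) {
            TestInputTopic<String, String> in =
                driver.createInputTopic("raw-events", new StringSerializer(), new StringSerializer());
            TestOutputTopic<String, String> out =
                driver.createOutputTopic("refined-events", new StringDeserializer(), new StringDeserializer());

            in.pipeInput("sensor-1", "temperature=21.5");
            in.pipeInput("sensor-2", "");          // empty value: dropped by the filter
            in.pipeInput("sensor-3", "pressure=1.2");

            while (!out.isEmpty()) {
                System.out.println(out.readValue());
            }
        }
    }
}
```

The same topology pattern scales to the other areas: resiliency comes from Kafka's replicated log, freshness from consuming the latest offsets, and resource optimization from running multiple stream threads or application instances against the same topic partitions.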

At Bitrock, we tackle software complexity through domain-driven design, borrowing the concept of bounded contexts and ensuring a modular architecture through loose coupling. Whenever necessary, we commit to a microservice architecture.

Due to their immutable nature, events are a great fit as our unique source of truth. They are self-contained units of business facts and also represent a perfect implementation of a contract amongst components. The Team chose the Confluent Platform for its ability to implement an asynchronous microservice architecture that can evolve over time, backed by a persistent log of immutable events ready to be independently consumed by clients.

This inspired our Team to create a dashboard that uses the practices above to clearly present processed data to an end user. Specifically, it shows air traffic, which provides an open, near-real-time stream of constantly updating data.

If you want to read the full article and discover all project details, architecture, findings and roadmap, click here:
