Solution Feature

AI Pipelines

AI pipelines define data analyses on live streams of machine data. Instead of only connecting graphical blocks via drag-and-drop, more complex monitoring and data-harmonization logic can also be generated from plain-language descriptions as explainable code.

Request a demo

AI pipelines in productive use

Analyze live data streams with language-based code

AI pipelines are designed for live data streams. They define monitoring, harmonization steps, and analytical logic directly on running machine data, rather than relying on later exports or one-off scripts.

The main difference from classic pipelines is that complex logic is not composed solely of reusable graphical blocks. Instead, users describe in natural language which live analysis or harmonization they need. This is particularly helpful when graphical blocks become too rigid or too unwieldy for the task at hand.

From this description, the platform generates code in suitable programming languages, executes it on high-frequency data streams, and makes the results available again as live data in the platform. This makes even sophisticated evaluations far more accessible in day-to-day work.
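The page does not show what the generated code actually looks like, and the platform's internal APIs are not documented here. As an illustration only, here is a minimal, self-contained sketch of the kind of explainable stream logic a prompt such as "flag readings that deviate strongly from the recent average" might produce; all names and parameters are hypothetical:

```python
from collections import deque
from statistics import mean, stdev

def rolling_anomaly_detector(window_size=50, threshold=3.0):
    """Flag values that deviate more than `threshold` standard
    deviations from the mean of a rolling window of recent readings."""
    window = deque(maxlen=window_size)

    def check(value):
        is_anomaly = False
        if len(window) >= 10:  # wait for a minimal baseline
            mu, sigma = mean(window), stdev(window)
            is_anomaly = sigma > 0 and abs(value - mu) > threshold * sigma
        window.append(value)
        return is_anomaly

    return check

# Simulated live stream: stable readings with one outlier
check = rolling_anomaly_detector(window_size=20, threshold=3.0)
readings = [20.0, 20.1, 19.9, 20.2, 20.0, 19.8, 20.1, 20.0, 19.9, 20.1,
            20.0, 35.0, 20.1]
flags = [check(v) for v in readings]  # only the 35.0 reading is flagged
```

Because the logic is ordinary, readable code rather than a black box, it can be inspected and adjusted before running it against a real stream.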

Classification

When AI pipelines make sense

AI pipelines are designed for continuous monitoring, automated reactions, and the ongoing processing of machine data.

If a graphical pipeline is too simple for the required logic, or if results should flow directly back into running processes, AI pipelines are the more suitable option. For exploratory one-off analyses and the examination of stored data, on the other hand, AI Notebooks are the better fit.

Prompt-driven live analytics

Classic pipelines vs. AI pipelines

Compare directly in the workflow

Classic pipelines consist of graphical drag-and-drop blocks. AI pipelines supplement this approach with language-based code generation for more complex live analyses, monitoring and data harmonization.

Request a demo
Classic pipeline editor
AI pipeline with model steps

What AI pipelines do in operation

From a simple definition to feeding results back into live data streams.

Definition
By language

Complex live monitoring and data harmonization can also be defined in natural language instead of via drag-and-drop.

Code
Explainable

The generated code remains visible and explainable, and is therefore traceable rather than a black box.

Execution
Live

The generated analyses run directly on high-frequency live data streams.

Results
Reusable downstream

The results are available again as live data streams in the platform and can trigger further actions.

Entry into a new AI pipeline

Define live analyses instead of just graphical blocks

In addition to classic graphical pipelines, complex monitoring or data harmonization can be described directly in natural language. The result is code, rather than modeling confined to fixed blocks.

AI Pipeline Builder

Generate and understand code in suitable languages

AI pipelines generate executable code in suitable programming languages. The code remains visible and explainable, and can be reviewed from both a domain and a technical perspective before productive use.

AI pipelines in the production overview

Generate test data and return results as live data

AI-supported test data can be generated to check the pipeline. The results of the execution are then available again as live data streams in the platform.
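How the platform generates test data is not specified here. As a rough, hypothetical sketch of the idea, the snippet below builds a synthetic temperature stream (a slow drift plus noise, with one injected anomaly) that a pipeline could be exercised against; all parameter names are illustrative:

```python
import math
import random

def generate_test_stream(n=200, base=60.0, amplitude=5.0, noise=0.5,
                         anomaly_at=150, anomaly_value=95.0, seed=42):
    """Generate synthetic temperature readings: a slow sine drift plus
    Gaussian noise, with one injected anomaly to test the pipeline."""
    rng = random.Random(seed)  # fixed seed for reproducible test runs
    stream = []
    for i in range(n):
        value = base + amplitude * math.sin(i / 25) + rng.gauss(0, noise)
        if i == anomaly_at:
            value = anomaly_value  # deliberate outlier
        stream.append(round(value, 2))
    return stream

test_data = generate_test_stream()
```

Feeding such a stream through a pipeline makes it easy to confirm that exactly the injected anomaly is detected before the pipeline is pointed at real machine data.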

Fields of application

Typical fields of application in production

AI pipelines are designed for production-related tasks that domain experts understand directly, such as anomaly detection on live data streams, temperature monitoring, condition monitoring, pattern recognition in signals, or the harmonization of heterogeneous machine values.

The returned live results can then be used in other pipelines, for example to feed data sinks, trigger actions, or send notifications to Microsoft Teams.
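As a hedged illustration of this pattern (the platform's actual action mechanism is not documented here), the sketch below wires a simple temperature limit check to a pluggable `notify` callback. In production, `notify` might POST a message to a Microsoft Teams incoming webhook; here it just collects messages in a list:

```python
def temperature_monitor(limit_celsius, notify):
    """Return a handler that checks each reading against a fixed limit
    and invokes `notify` (e.g. a webhook sender) on violations."""
    def handle(machine_id, value):
        if value > limit_celsius:
            notify(f"{machine_id}: temperature {value} °C exceeds "
                   f"limit of {limit_celsius} °C")
            return True
        return False
    return handle

# Illustrative usage: collect alerts instead of sending them
alerts = []
handle = temperature_monitor(80.0, notify=alerts.append)
for reading in [72.5, 79.9, 84.2, 76.0]:
    handle("press-07", reading)  # only 84.2 triggers a notification
```

Swapping the callback is the only change needed to route the same alert into a different sink, which is the point of returning results as reusable live data.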

Demo

Experience Bytefabrik live

Arrange a demo to get to know the platform and its analysis and AI functions, guided by your questions.

Book a demo