Getting started
🧑💻 The files and code excerpts mentioned below are available in the Zenoh-Flow repository.
Now that you have a working installation, let us create our first Zenoh-Flow application.
As is traditional in computer science, we will start with a simple "Hello, World!": we read a chain of characters and replace "World" in the previous sentence with what we read. For instance, if we read "Alice" our application should produce: "Hello, Alice!".
To launch a Zenoh-Flow application we need the following:
- A set of "compatible" types: as data can cross devices or programming languages, a common representation is required.
- A set of nodes: descriptors paired with shared libraries that implement Zenoh-Flow's interface.
- The application descriptor: it describes the structure of our application, i.e. the nodes involved, how they are connected and where to find their implementation.
Let's look at each in more detail.
In our Getting Started implementation, we use Protobuf's representation of strings through the `prost` crate.
There are multiple serialisation / deserialisation libraries available; our choice was motivated solely by familiarity with this one. We advise you to take the time to compare the options and pick the one that best suits your needs.
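To make the idea of a shared representation concrete, here is a minimal sketch of what a protobuf-encoded string looks like on the wire. This is hand-rolled for illustration only; in the actual example, the `prost` crate generates this encoding for you, and the helper name below is ours, not part of any library.

```rust
// Illustration only: encode a single protobuf `string` field using the wire
// format that a library such as prost produces. A length-delimited field is:
// one tag byte, a varint length, then the UTF-8 bytes of the string.
fn encode_string_field(field_number: u8, value: &str) -> Vec<u8> {
    let mut out = Vec::new();
    // Tag = (field_number << 3) | wire_type; wire type 2 = length-delimited.
    out.push((field_number << 3) | 2);
    // For short strings the length fits in a single varint byte.
    assert!(value.len() < 128, "this sketch only handles single-byte lengths");
    out.push(value.len() as u8);
    out.extend_from_slice(value.as_bytes());
    out
}

fn main() {
    let encoded = encode_string_field(1, "Alice");
    println!("{:02X?}", encoded); // [0A, 05, 41, 6C, 69, 63, 65]
}
```

Whatever library you pick, the point is the same: both ends of a link must agree on this byte-level representation.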
Our application will (i) obtain our chain of characters from a Zenoh subscriber, (ii) process it, (iii) write the result in a file and (iv) publish it on Zenoh. We will thus have four nodes:
- A Zenoh subscriber: a Source.
- A processing node: an Operator.
- A node that logs the result to a file: a Sink.
- A node that publishes the result on Zenoh: another Sink.
The Source, Operator and Sink are the three types of nodes that Zenoh-Flow supports. A Source fetches data from outside an application, an Operator computes over data and a Sink sends data "outside" an application.
The main difference between these nodes is their `Input` / `Output` ports:
- A Source only has `Output` ports.
- An Operator has both `Input` and `Output` ports.
- A Sink only has `Input` ports.
Let us see next how to write their descriptor and implement them.
A Source has to expose in its descriptor the following:
- An `id`: an identifier, unique within this data flow.
- (optional) A `description`: a human-readable summary of what the Source does.
- A `library`: where a Zenoh-Flow Daemon can find the shared library.
- Its `outputs`: a list of identifiers. (⚠️ Two `outputs` cannot share the same identifier.)
To illustrate, the descriptor of the `zenoh-sub` could be:

```yaml
id: zenoh-sub
description: A Zenoh Subscriber.
library: file:///path/to/libzenoh_sub.so
outputs:
  - hello
```
Then, to create a Source two traits must be implemented:
- `Source`: how to create an instance of the Source.
- `Node`: the implementation of a single iteration. A Zenoh-Flow Daemon will call this method in a loop.
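The division of labour between these two traits can be sketched with simplified, local stand-ins. Note that these are assumptions for illustration: the real Zenoh-Flow traits are asynchronous and take a context, a configuration and output handles, so the signatures below are deliberately reduced.

```rust
// Simplified, local stand-ins for the two traits described above.
// The real Zenoh-Flow traits have richer, async signatures; this sketch
// only shows the split between "create an instance" and "one iteration".
trait SourceFactory: Sized {
    // How to create an instance of the Source.
    fn new() -> Self;
}

trait Node {
    // A single iteration; a Zenoh-Flow Daemon calls this in a loop.
    fn iteration(&mut self) -> Result<String, String>;
}

struct CountingSource {
    counter: u32,
}

impl SourceFactory for CountingSource {
    fn new() -> Self {
        CountingSource { counter: 0 }
    }
}

impl Node for CountingSource {
    fn iteration(&mut self) -> Result<String, String> {
        // A real Source would fetch data from outside the application
        // (e.g. via a Zenoh subscriber); here we just produce a counter.
        self.counter += 1;
        Ok(format!("sample-{}", self.counter))
    }
}

fn main() {
    let mut source = CountingSource::new();
    // The daemon's loop, reduced to three iterations.
    for _ in 0..3 {
        println!("{}", source.iteration().unwrap());
    }
}
```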
> [!NOTE]
> Considering that Zenoh-Flow is tightly coupled to Zenoh, we did not provide an implementation of the `zenoh-sub` Source and instead used a built-in Zenoh Subscriber. We cover this topic right after the other Nodes.
Similarly, an Operator has to expose:
- An `id`: an identifier, unique within this data flow.
- (optional) A `description`: a human-readable summary of what the Operator does.
- A `library`: where a Zenoh-Flow Daemon can find the shared library.
- Its `inputs`: a list of identifiers. (⚠️ Two `inputs` cannot share the same identifier.)
- Its `outputs`: a list of identifiers. (⚠️ Two `outputs` cannot share the same identifier.)
To illustrate, the descriptor of the `greetings-maker` could be:

```yaml
id: greetings-maker
# Do not forget to change the extension depending on your operating system!
# Linux -> .so
# Mac OS -> .dylib
library: file:///path/to/libgreetings_maker.so
inputs:
  - name
outputs:
  - greeting
```
To create an Operator two traits must be implemented:
- `Operator`: how to create an instance of the Operator.
- `Node`: the implementation of a single iteration. A Zenoh-Flow Daemon will call this method in a loop.
You can browse our implementation of the Greetings Maker operator here.
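The processing step at the heart of this operator is tiny: substitute the received name into the template sentence. A minimal sketch of that logic (the function name is ours, not the repository's):

```rust
// The core of the greetings-maker operator: replace "World" in
// "Hello, World!" with the name received on the `name` input.
fn make_greeting(name: &str) -> String {
    "Hello, World!".replace("World", name)
}

fn main() {
    println!("{}", make_greeting("Alice")); // Hello, Alice!
}
```

In the real operator this function body sits inside the `iteration` of the `Node` trait, between deserialising the input and sending the result on the `greeting` output.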
Finally, a Sink has to expose:
- An `id`: an identifier, unique within this data flow.
- (optional) A `description`: a human-readable summary of what the Sink does.
- A `library`: where a Zenoh-Flow Daemon can find the shared library.
- Its `inputs`: a list of identifiers. (⚠️ Two `inputs` cannot share the same identifier.)
To illustrate, the descriptor of the `file-writer` could be:

```yaml
id: file-writer
# Do not forget to change the extension depending on your operating system!
# Linux -> .so
# Mac OS -> .dylib
library: file:///path/to/libfile_writer.so
inputs:
  - in
```
To create a Sink two traits must be implemented:
- `Sink`: how to create an instance of the Sink.
- `Node`: the implementation of a single iteration. A Zenoh-Flow Daemon will call this method in a loop.
You can browse our implementation of the File Writer sink here.
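The behaviour of such a sink boils down to appending each received greeting to a file. A self-contained sketch of that behaviour, with the function name and path being ours for illustration:

```rust
use std::fs::OpenOptions;
use std::io::Write;

// Append one greeting per line to the output file, creating it if needed.
// This mirrors what a file-writing Sink does with each received message.
fn log_greeting(path: &str, greeting: &str) -> std::io::Result<()> {
    let mut file = OpenOptions::new().create(true).append(true).open(path)?;
    writeln!(file, "{}", greeting)
}

fn main() -> std::io::Result<()> {
    let path = "/tmp/greetings-sketch.txt";
    log_greeting(path, "Hello, Alice!")?;
    log_greeting(path, "Hello, Bob!")?;
    Ok(())
}
```

Opening in append mode is what lets `tail -f` (used later in this guide) show new greetings as they arrive.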
In cases where your Source(s) or Sink(s) only subscribe or publish to specific key expressions, we provide built-in nodes for which we already implemented this functionality.
To use these built-in nodes, one only has to provide a special descriptor.

For the built-in subscriber:

```yaml
id: zenoh-built-in-subscriber
zenoh-subscribers:
  hello: "zf/getting-started/hello"
  # Multiple subscribers are supported, simply add one per line.
  # hi: "zf/getting-started/hi"
```

For the built-in publisher:

```yaml
id: zenoh-built-in-publisher
zenoh-publishers:
  greeting: "zf/getting-started/greeting"
  # Multiple publishers are supported, simply add one per line.
  # welcome: "zf/getting-started/welcome"
```
The structure is the same for both: `<identifier>: <key expression>`. The `identifier` is used to connect the built-in nodes in the `links` section of the data flow descriptor.
And we're set! We have all the building blocks we need to describe our application.
Let us write the data flow step by step. We first need to give it a name:

```yaml
name: hello-world
```
We then need to specify the nodes that compose it and where we can find their description. Each type of node has a dedicated section. For our example, the declaration can look like this:
```yaml
vars:
  TARGET_DIR: file://todo!
  BUILD: release
  DLL_EXTENSION: so
  OUT_FILE: /tmp/greetings.txt

sources:
  - id: zenoh-sub
    zenoh-subscribers:
      hello: "zf/getting-started/hello"

operators:
  - id: greetings-maker
    library: "{{ TARGET_DIR }}/{{ BUILD }}/examples/libgreetings_maker.{{ DLL_EXTENSION }}"
    inputs:
      - name
    outputs:
      - greeting

sinks:
  - id: zenoh-writer
    zenoh-publishers:
      greeting: "zf/getting-started/greeting"

  - id: file-writer
    configuration:
      file: "{{ OUT_FILE }}"
    library: "{{ TARGET_DIR }}/{{ BUILD }}/examples/libfile_writer.{{ DLL_EXTENSION }}"
    inputs:
      - in
```
> [!NOTE]
> The `vars` section is a special section that Zenoh-Flow uses for pre-processing (more specifically, string replacement). Every time Zenoh-Flow encounters two pairs of curly braces, also known as mustaches, it will replace them with the value associated with the variable enclosed inside. Thus, for the above declaration, every occurrence of `{{ TARGET_DIR }}` will be replaced with `file://todo!`. See the dedicated page for more information.
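The substitution itself is plain string replacement. A sketch of this pre-processing step, using our own helper name and a hypothetical descriptor line:

```rust
// Sketch of the `vars` pre-processing: every `{{ NAME }}` in the
// descriptor text is replaced by the value bound to NAME.
fn expand_vars(text: &str, vars: &[(&str, &str)]) -> String {
    let mut result = text.to_string();
    for (name, value) in vars {
        // format! doubles the braces to produce a literal "{{ NAME }}".
        result = result.replace(&format!("{{{{ {} }}}}", name), value);
    }
    result
}

fn main() {
    let vars = [("BUILD", "release"), ("DLL_EXTENSION", "so")];
    let line = "library: \"{{ BUILD }}/examples/libgreetings_maker.{{ DLL_EXTENSION }}\"";
    println!("{}", expand_vars(line, &vars));
    // library: "release/examples/libgreetings_maker.so"
}
```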
To complete our application descriptor we need to specify how the nodes are connected: the links that exist. A link connects an `Output` port to an `Input` port.
```yaml
links:
  - from:
      node: zenoh-sub
      output: hello
    to:
      node: greetings-maker
      input: name

  - from:
      node: greetings-maker
      output: greeting
    to:
      node: file-writer
      input: in

  - from:
      node: greetings-maker
      output: greeting
    to:
      node: zenoh-writer
      input: greeting
```
The `output` and `input` in the links section must match what is declared in the respective node descriptors (the values throughout this guide are consistent).
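A mismatch between a link endpoint and a declared port is an easy mistake to make, so it helps to think of the check the runtime performs. A simplified sketch of that validation, with the function name and data layout being our own:

```rust
use std::collections::HashSet;

// Check that every link endpoint refers to a port declared by its node.
// Endpoints are (node id, port id) pairs; `declared` is the set of such
// pairs gathered from the node descriptors.
fn undeclared_endpoints<'a>(
    links: &[((&'a str, &'a str), (&'a str, &'a str))],
    declared: &HashSet<(&'a str, &'a str)>,
) -> Vec<(&'a str, &'a str)> {
    links
        .iter()
        .flat_map(|(from, to)| [*from, *to])
        .filter(|endpoint| !declared.contains(endpoint))
        .collect()
}

fn main() {
    let declared: HashSet<_> = [
        ("zenoh-sub", "hello"),
        ("greetings-maker", "name"),
        ("greetings-maker", "greeting"),
        ("file-writer", "in"),
    ]
    .into_iter()
    .collect();

    let links = [
        (("zenoh-sub", "hello"), ("greetings-maker", "name")),
        // Typo on purpose: the output port is "greeting", not "greetings".
        (("greetings-maker", "greetings"), ("file-writer", "in")),
    ];

    println!("{:?}", undeclared_endpoints(&links, &declared));
    // [("greetings-maker", "greetings")]
}
```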
That's it! The description of our application is complete, all that is left is to launch it.
> [!NOTE]
> This section assumes that you have at least one Zenoh-Flow Daemon running on your network.
Provided that the paths, ports and links in the `getting-started.yaml` file are correct (i.e. adapted to your machine), we can ask our running Zenoh-Flow Daemon to launch our application:

```shell
cargo run --release -p zfctl -- instance create getting-started.yaml
```
This command will output the unique identifier associated to this instance of your data flow. You can then use it to query its status:

```shell
cargo run --release -p zfctl -- instance -n alice status <instance-uuid>
```
If you see an output akin to:

```text
+----------------------------------+------------------------------------------+-------------------------------------------------------+
| Runtime                          | Instance State                           | Node                                                  |
+=====================================================================================================================================+
| 9d3d97157a1f5bf148b6e7bef57c0ec5 | Loaded on 2025-01-16T16:03:31.133698999Z | file-writer, greetings-maker, zenoh-writer, zenoh-sub |
+----------------------------------+------------------------------------------+-------------------------------------------------------+
```

it indicates that the data flow was correctly loaded on the Zenoh-Flow Daemon, whose unique identifier is `9d3d97157a1f5bf148b6e7bef57c0ec5`. However, loaded does not mean that the data flow is active; for that we need to tell the Zenoh-Flow Daemon to start it:
```shell
cargo run --release -p zfctl -- instance -n alice start <instance-uuid>
```
If you query the status of the data flow again, you will see something akin to:

```text
+----------------------------------+----------------------------------------------+-------------------------------------------------------+
| Runtime                          | Instance State                               | Node                                                  |
+=========================================================================================================================================+
| 9d3d97157a1f5bf148b6e7bef57c0ec5 | Running since 2025-01-16T16:16:51.723101999Z | file-writer, greetings-maker, zenoh-writer, zenoh-sub |
+----------------------------------+----------------------------------------------+-------------------------------------------------------+
```

This time it indicates that our data flow is active! Time to test it:
```shell
# Launch this command in a separate terminal
tail -f /tmp/greetings.txt
```
Now for some live interactions. As our Source, `zenoh-sub`, is subscribed to `"zf/getting-started/hello"`, we need to publish values on this key expression. Several options are possible:

```shell
# If you have compiled the `z_put` example of Zenoh in debug
./zenoh/target/debug/z_put -k "zf/getting-started/hello" -v "Alice"

# If you have enabled the REST plugin of Zenoh
curl -X PUT -H "content-type:text/plain" -d 'Bob' http://localhost:8000/zf/getting-started/hello
```
The terminal should display the following, indicating that our application is behaving as expected:

```text
Hello, Bob!
Hello, Alice!
```
To stop it, you can send the following command:

```shell
cargo run --release -p zfctl -- instance -n alice abort <instance-uuid>
```