Fan-Out Quickstart (Local)¶
This example shows how to run the same pipeline multiple times with different parameters (fan-out) using the built-in Float demo processors.
What you’ll run¶
For each (value, addend, factor) tuple:

1. FloatValueDataSource emits value.
2. FloatAddOperation adds a constant addend (1.0).
3. FloatMultiplyOperation multiplies by factor.
4. FloatCollectValueProbe captures the computed float into the context under the key result (this mirrors the pipeline's inspected output so it can be used downstream).
5. A string builder context factory creates a filename using all fan-out inputs and the computed result (pattern: {value}_plus_{addend}_times_{factor}_equals_{result}.txt). That filename is placed into the context as the path for the saver.
6. FloatTxtFileSaver reads the path from the context and writes the textual result to disk; one file is produced per fan-out run.
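To make the data flow concrete, here is a plain-Python sketch of a single fan-out run. This is not the Semantiva API; the step comments map to the nodes above, and the exact filename and number formatting produced by the real string builder and saver may differ.

```python
# Plain-Python sketch of ONE fan-out run (illustration only, not Semantiva API).
def run_once(value: float, addend: float, factor: float) -> str:
    data = value                      # 1) FloatValueDataSource emits `value`
    data = data + addend              # 2) FloatAddOperation adds `addend`
    data = data * factor              # 3) FloatMultiplyOperation multiplies by `factor`
    context = {"result": data}        # 4) FloatCollectValueProbe stores the float as `result`
    # 5) string builder: assemble the output path from the fan-out inputs and the result
    path = f"{value}_plus_{addend}_times_{factor}_equals_{context['result']}.txt"
    # 6) FloatTxtFileSaver: write the textual result to the path taken from the context
    with open(path, "w") as fh:
        fh.write(str(context["result"]))
    return path

print(run_once(1.0, 1.0, 10.0))  # -> 1.0_plus_1.0_times_10.0_equals_20.0.txt
```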
Notes and rationale
- Multi-parameter ZIP fan-out: the multi block now supplies three lists (value, addend, factor). The engine zips them and runs one pipeline per index; equal lengths are enforced.
- Supplying parameters via context: the demo processors do not declare these parameters explicitly; the fan-out injects value, addend, and factor into the run context, and the processors pick them up automatically. This lets you vary any number of parameters without changing node parameter blocks (a sketch of this resolution idea follows these notes).
- Filename building: using a stringbuild context factory (registered as a context-producing processor) lets you assemble filenames and IDs from runtime values cleanly and portably.
- Execution and tracing: the default local orchestrator runs jobs serially. The trace driver (jsonl) writes SER files under ./ser/ containing per-run pinned arguments (e.g. the fan-out index and the values used) so runs can be inspected or replayed later.
The final result is (value + 1.0) × factor.
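To make the context-based resolution concrete, here is a hypothetical sketch of the idea. The actual resolution logic and its precedence rules live in the framework, so treat the function below as illustration only.

```python
# Hypothetical sketch of context-based parameter resolution (not Semantiva internals).
def resolve_parameter(name, node_parameters, run_context):
    # Assumption: an explicit node-level parameter would win; otherwise the value
    # injected into the run context by the fan-out is used.
    if name in node_parameters:
        return node_parameters[name]
    return run_context[name]

run_context = {"value": 2.0, "addend": 1.0, "factor": 20.0}  # injected by fan-out (run index 1)
print(resolve_parameter("factor", {}, run_context))  # -> 20.0
```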
YAML¶
# docs/source/examples/fanout_floats.yaml
# Fan-out (local): FloatValueDataSource -> FloatAddOperation -> FloatMultiplyOperation -> FloatCollectValueProbe -> FloatTxtFileSaver
# For each (value, addend, factor) tuple, run the pipeline serially (default local orchestrator).
extensions: ["semantiva-examples"]
pipeline:
  nodes:
    # 1) Source: expects 'value' (supplied by fan-out via context)
    - processor: FloatValueDataSource
    # 2) Add: expects 'addend' (supplied by fan-out via context; a constant 1.0 keeps the example clear)
    - processor: FloatAddOperation
    # 3) Multiply: expects 'factor' (supplied by fan-out via context)
    - processor: FloatMultiplyOperation
    # 4) Collect float value into the context
    - processor: FloatCollectValueProbe
      context_keyword: "result"
    # 5) Build a filename string using fan-out parameters
    - processor: stringbuild:"{value}_plus_{addend}_times_{factor}_equals_{result}.txt":path
    # 6) Write the result to a text file
    - processor: FloatTxtFileSaver
trace:
  driver: jsonl
  options:
    detail: all
    output_path: ./ser/
fanout:
  # Multi-parameter ZIP fan-out (equal lengths enforced)
  multi:
    value: [1.0, 2.0, 3.5]
    addend: [1.0, 1.0, 1.0]
    factor: [10.0, 20.0, 30.0]
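As a quick preview of the ZIP semantics, the following plain-Python loop mirrors what the engine does with the three lists above: one run per index, one output file per run. The printed names follow the filename pattern in the YAML; the framework's actual formatting may differ slightly.

```python
# Preview of the ZIP fan-out: one run (and one file) per index of the zipped lists.
values = [1.0, 2.0, 3.5]
addends = [1.0, 1.0, 1.0]
factors = [10.0, 20.0, 30.0]
assert len(values) == len(addends) == len(factors)  # equal lengths are enforced

for i, (value, addend, factor) in enumerate(zip(values, addends, factors)):
    result = (value + addend) * factor
    print(i, f"{value}_plus_{addend}_times_{factor}_equals_{result}.txt")
# 0 1.0_plus_1.0_times_10.0_equals_20.0.txt
# 1 2.0_plus_1.0_times_20.0_equals_60.0.txt
# 2 3.5_plus_1.0_times_30.0_equals_135.0.txt
```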
Key ideas¶
- Fan-out (ZIP): the multi block provides three lists of equal length (value, addend, factor). The engine creates one run per index.
- Parameter resolution from context: the nodes do not declare value, addend, or factor in their parameters; these are injected into the context by the fan-out and picked up automatically by the processors.
- Local execution: no execution: block is needed; the default local orchestrator runs each job serially.
- SER: JSONL files are written into the ./ser/ directory; the trace driver creates one JSONL file per pipeline run (one file per fan-out job) containing per-run pinned arguments (e.g. fanout.index, fanout.values).
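To inspect the trace output, the JSONL records can be read with standard tooling. A minimal sketch follows, assuming the files land under ./ser/ with a .jsonl extension and contain one JSON object per line; the exact record schema (e.g. where fanout.index and fanout.values appear) depends on the trace driver version, so the script just prints the keys each record carries.

```python
# Minimal sketch: list the keys of every JSONL trace record found under ./ser/.
import json
from pathlib import Path

for path in sorted(Path("./ser").glob("*.jsonl")):  # adjust the glob if your driver uses another extension
    print(f"== {path.name}")
    with path.open() as fh:
        for line in fh:
            if not line.strip():
                continue
            record = json.loads(line)
            print(sorted(record.keys()))
```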
Next steps¶
- To run on Ray, add an execution: block selecting the registered Ray orchestrator/executor/transport (see Pipeline Configuration Schema).
- To build filenames or IDs from fan-out values, use the string builder context factory (see Context Processors).