Pipeline

craph edited this page Dec 12, 2020 · 1 revision

You said pipeline?

Keep the previous example in mind; I will reuse it here.

The heart of the processing is based on a pipeline. It is similar to the pipe operator on Unix systems: every action performs a basic operation and forwards its data to the following action:

you@computer:>cat /var/log/auth.log | head | grep "sudo"

feedoo processes data in the same way, but adds a tag to the data. This way, a following action can decide either to process the data (if the tag matches) or to simply forward it to the next action. The tag is added by the data producer ("my_pipeline" in input_dummy), and downstream actions try to match it ("*" in output_stdout). In the feedoo context, we call the data an Event. An Event contains the data itself, called the record (a dict), a Unix timestamp, and the tag.
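To make this concrete, here is a minimal sketch of what an Event could look like, with "*"-style tag matching behaving like a shell glob. The class and function names below are illustrative assumptions for this page, not feedoo's actual API:

```python
import time
from dataclasses import dataclass, field
from fnmatch import fnmatch


@dataclass
class Event:
    # tag set by the data producer, e.g. "my_pipeline" in input_dummy
    tag: str
    # the data itself: a plain dict called the record
    record: dict
    # unix timestamp of when the event was produced
    timestamp: float = field(default_factory=time.time)


def tag_matches(pattern: str, event: Event) -> bool:
    # "*" matches any tag, like a shell glob (output_stdout's pattern)
    return fnmatch(event.tag, pattern)


event = Event(tag="my_pipeline", record={"msg": "hello"})
print(tag_matches("*", event))      # True
print(tag_matches("my_*", event))   # True
print(tag_matches("other", event))  # False
```

An action whose pattern does not match simply passes the Event along unchanged, which is what lets several logical pipelines share one chain of actions.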

Actions fall into four categories:

  1. input
  2. output
  3. filter
  4. parser
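The chaining itself can be sketched as each action checking the tag, optionally handling the event, then forwarding it downstream, much like the Unix pipe above. This is a hedged sketch under assumed names (`Action`, `Grep`, `receive`); feedoo's real classes may differ:

```python
from fnmatch import fnmatch


class Action:
    """Base action: process events whose tag matches, forward everything."""

    def __init__(self, pattern):
        self.pattern = pattern
        self.next_action = None

    def do(self, event):
        # hook overridden by filters/outputs; may transform the event
        return event

    def receive(self, event):
        if fnmatch(event["tag"], self.pattern):
            event = self.do(event)
        # forward downstream, like a Unix pipe
        if self.next_action is not None:
            self.next_action.receive(event)


class Grep(Action):
    """Filter-like action: collect records whose line contains a keyword."""

    def __init__(self, pattern, keyword, sink):
        super().__init__(pattern)
        self.keyword = keyword
        self.sink = sink

    def do(self, event):
        if self.keyword in event["record"]["line"]:
            self.sink.append(event["record"]["line"])
        return event


out = []
pipeline = Grep("*", "sudo", out)
for line in ["sudo su", "login ok", "sudo reboot"]:
    pipeline.receive({"tag": "my_pipeline", "record": {"line": line}})
print(out)  # ['sudo su', 'sudo reboot']
```

This mirrors the `cat | head | grep "sudo"` example: the input produces tagged events, and each following action either acts on them or lets them flow through.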