Krawler is a minimalist Extract, Transform, Load (ETL) tool used when data needs to be:

  • extracted from heterogeneous data sources (e.g. databases or web services)
  • transformed into a target format or structure suitable for querying and analysis (e.g. JSON or CSV)
  • loaded into a final target data store (e.g. a file system or a database).

It is built around the concept of a pipeline: a set of processing functions connected in series, where the output of one function is the input of the next. Beyond the standard processing functions available out of the box, custom functions and pipelines can be set up to power your business processes.