About DaFT

DaFT is your all-in-one solution for delivering every data output in a consistent, normalised format, regardless of the source or style of that output. Our mission is to simplify the integration of your data with leading monitoring and metrics systems, making complex data flows both accessible and actionable.

Overview

Modern data environments are increasingly complex, with data streaming from a vast array of sources: databases, files, web services, and beyond. DaFT bridges these sources by providing a unified API that standardises data across diverse formats, ensuring reliable and secure integration with your existing tooling. Whether you are ingesting metrics from IoT sensors, logs from microservices, or performance data from large databases, DaFT helps you manage and visualise your data seamlessly.

What's New

Refactoring and Improvements

  • Enhanced Security:

    • Implements URL and file path validation to prevent SSRF and directory traversal attacks (a minimal validation sketch follows this list).

    • Utilises prepared statements and robust input sanitisation to secure your data pipelines.

  • Performance Enhancements:

    • Optimises data normalisation and filtering processes to improve throughput.

    • Introduces batch processing support in database handlers to manage large-scale data ingestion efficiently.
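
As a minimal, illustrative sketch of the validation techniques named above (not DaFT's actual implementation; the function names and rules are assumptions), checks along these lines reject URLs that resolve to internal addresses and file paths that escape a configured base directory:

```python
# Illustrative only: common SSRF and directory-traversal guards.
# DaFT's real checks may differ in naming and policy.
import ipaddress
import socket
from pathlib import Path
from urllib.parse import urlparse

ALLOWED_SCHEMES = {"http", "https"}


def is_safe_url(url: str) -> bool:
    """Reject non-HTTP(S) schemes and hosts that resolve to private,
    loopback, or link-local addresses (a basic SSRF guard)."""
    parsed = urlparse(url)
    if parsed.scheme not in ALLOWED_SCHEMES or not parsed.hostname:
        return False
    try:
        for info in socket.getaddrinfo(parsed.hostname, None):
            address = ipaddress.ip_address(info[4][0])
            if address.is_private or address.is_loopback or address.is_link_local:
                return False
    except (socket.gaierror, ValueError):
        return False
    return True


def is_safe_path(candidate: str, base_dir: str) -> bool:
    """Reject paths that escape the configured base directory (directory traversal)."""
    base = Path(base_dir).resolve()
    target = (base / candidate).resolve()
    return target == base or base in target.parents
```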

New Features

  • Extensible Architecture:

    • Leverages design patterns such as Factory and Strategy to allow the easy addition of new Data Handlers and Exporters, ensuring scalability as your data needs evolve (a minimal sketch follows this list).

  • Improved Format Support:

    • Expands compatibility with JSON, XML, CSV, and Prometheus data formats.

    • Offers flexible querying and filtering capabilities to tailor data outputs to your specific requirements.
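
To make the Factory and Strategy ideas above concrete, here is a minimal sketch under assumed names (the classes, methods, and registry are illustrative, not DaFT's actual API): each handler is a strategy that normalises one kind of source, and a factory builds the right handler from a source's type value.

```python
# Illustrative Factory/Strategy sketch; names are assumptions, not DaFT's API.
from abc import ABC, abstractmethod
import json
import urllib.request


class DataHandler(ABC):
    """Strategy interface: every handler normalises its source into a list of dicts."""

    @abstractmethod
    def fetch(self) -> list[dict]:
        ...


class WebJsonHandler(DataHandler):
    """Handles sources declared as type = web-json."""

    def __init__(self, config: dict):
        self.url = config["url"]

    def fetch(self) -> list[dict]:
        with urllib.request.urlopen(self.url) as response:
            data = json.load(response)
        return data if isinstance(data, list) else [data]


# Factory registry keyed by the source's type value; supporting a new format
# means registering one more handler class here.
HANDLERS: dict[str, type[DataHandler]] = {"web-json": WebJsonHandler}


def make_handler(config: dict) -> DataHandler:
    """Build the handler that matches the source's type."""
    return HANDLERS[config["type"]](config)
```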

Supported Data Sources and Formats

DaFT is designed to work with a vast range of data sources—from traditional databases and time-series databases to modern log analytics tools such as Splunk and Athena. Our modular approach ensures that you can easily extend support to new formats as your business grows.

File and Web Formats

  • JSON Handlers:

    • Web JSON (type = web-json): Fetches JSON directly from a URL.

    • App JSON (type = app-json): Executes commands to return JSON output.

    • File JSON (type = file-json): Reads JSON from local files.

  • XML Handlers:

    • Web XML (type = web-xml): Fetches XML from a URL.

    • App XML (type = app-xml): Executes commands to provide XML data.

    • File XML (type = file-xml): Reads XML from a local file.

  • CSV Handlers:

    • Web CSV (type = web-csv): Fetches CSV data from a URL.

    • File CSV (type = file-csv): Reads CSV data from a local file.

    • Standard Output CSV (type = stdout-csv): Executes commands and processes CSV output.

  • Standard Output:

    • (type = stdout): Executes commands and processes the standard output stream.
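
Only the type values above are taken from this list; the section names, the url, path, and command keys, and the INI-style layout in the following sketch are assumptions intended purely to illustrate how sources of different kinds might be declared side by side:

```ini
; Hypothetical source definitions; only the type values are documented above.
[sensor_feed]
type = web-json
url  = https://sensors.example.com/metrics.json

[nightly_export]
type = file-csv
path = /var/data/exports/latest.csv

[disk_usage]
type = stdout
command = df -h
```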

Use Cases: Polling, Querying, and Visualising Data

DaFT’s flexibility opens up a diverse range of applications that empower organisations to make data-driven decisions:

1. Multi-Location Data Aggregation

Imagine you are overseeing a network of IoT sensors scattered across different regions. Each sensor transmits data in various formats such as JSON or CSV. DaFT can poll data from these multiple endpoints in real time, normalise it, and prepare it for ingestion into a time-series database like Prometheus. This enables you to perform lightweight queries on the aggregated data—bringing together disparate data points into a cohesive overview.
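
For instance, once readings from several regions have been normalised, the Prometheus text format that DaFT is described as supporting could carry them as a single, uniform series (the metric and label names here are invented for illustration):

```
# HELP sensor_temperature_celsius Temperature reported by each sensor
# TYPE sensor_temperature_celsius gauge
sensor_temperature_celsius{region="eu-west",sensor="s-101"} 21.4
sensor_temperature_celsius{region="us-east",sensor="s-207"} 19.8
```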

2. Infrastructure and Application Monitoring

In dynamic IT environments, applications and microservices often generate logs and metrics that are stored across different systems. DaFT supports numerous data sources, allowing you to pull application logs, system metrics, or performance statistics. By standardising these outputs, DaFT ensures you can feed Prometheus with reliable, uniform data. With Prometheus handling lightweight queries to identify trends or anomalies, you can then visually analyse these insights using powerful dashboards in Grafana.
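
As an illustration of the Prometheus side of this flow, and assuming DaFT (or an exporter sitting in front of it) exposes the normalised metrics over HTTP, a standard Prometheus scrape job could pull them in on a schedule; the job name, host, and port below are assumptions:

```yaml
# Hypothetical scrape job; the endpoint DaFT would expose is an assumption.
scrape_configs:
  - job_name: "daft"
    scrape_interval: 30s
    static_configs:
      - targets: ["daft-host:8080"]
```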

3. Real-Time Analytics and Visualisation

For businesses that rely on immediate insights, the combination of DaFT, Prometheus, and Grafana forms a robust real-time analytics stack. DaFT efficiently polls data from various origins—such as web APIs, internal databases, or file systems—then formats the data suitably for Prometheus. Once stored as time-series data, lightweight queries can extract critical metrics which are then seamlessly visualised in Grafana, giving you up-to-the-minute snapshots of performance or operational health.
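
As a small example of such a lightweight query (the metric and label names are invented), a dashboard panel might plot the per-second request rate over the last five minutes:

```promql
rate(app_requests_total{service="checkout"}[5m])
```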

4. Custom Reporting and Historical Analysis

DaFT’s extensible architecture is ideal for scenarios where you need to integrate historical data from legacy systems with current real-time data. By ensuring consistent output formats, DaFT helps bridge the gap between old and new data repositories. You can query time-series data from Prometheus to generate trend reports over weeks or months and use Grafana to overlay historical context with current operational data—ideal for performance benchmarking or capacity planning.
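
For example (again with an invented metric name), a query like the following averages a gauge over the past 30 days, provided Prometheus retention covers that window; Grafana can then plot the result alongside the live series for benchmarking:

```promql
avg_over_time(sensor_temperature_celsius[30d])
```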

DaFT stands at the intersection of flexibility, scalability, and security. By unifying data from diverse sources into a consistent format, it empowers organisations to leverage existing monitoring systems with ease. Whether you are enhancing your data visualisation through Grafana or performing lightweight queries in Prometheus, DaFT provides the comprehensive infrastructure necessary for reliable, real-time insights.

Discover how DaFT can transform your data pipelines and drive more informed decision-making across your entire organisation.