Search Results
27 total results found
Data Federation Tool
DaFT, the Data Federation Tool documentation
Omniplexer
Introduction
Installation
Supported Data Sources
Export Formats
Usage
Configuration
DaFT Installation Guide
Follow the steps below to get DaFT up and running on your server. 1. Clone the Repository Begin by cloning the repository to your local machine or server: git clone https://github.com/andydixon/DaFT.git cd DaFT Alternatively, you can download the repo...
About DaFT
DaFT is your all-in-one solution for ensuring that every data output is delivered in a consistent, normalised format—regardless of the source or style of the output. Our mission is to simplify and streamline the integration of your data with the leading monito...
JSON Parsing (from process or URL)
The web-json and app-json handlers in DaFT allow for powerful JSON extraction and filtering, enabling users to: Extract specific paths from deeply nested JSON. Select only certain fields from the extracted data. Maintain full backward compatibility (defau...
XML Parsing (From process or URL)
The web-xml and app-xml handlers in DaFT provide advanced XML extraction and filtering capabilities, allowing users to: Extract specific paths from deeply nested XML. Select only certain fields from the extracted data. Convert XML to JSON-like structures ...
Amazon Athena
This handler has not been tested; it was written against Amazon's published API specification and examples. When integrating Amazon Athena as a data source, you’ll be using an ODBC connection. This requires installing Amazon’s official Athena ODBC driver on your server. Download the ...
CSV File (from process or URL)
InfluxDB
Kafka
This handler has not been tested; it was written against an API specification and published examples.
Microsoft SQL Server
When integrating a Microsoft SQL Server (MSSQL) data source using this handler, the configuration remains in .ini format, with some fields consistent across database types, and some that are specific to MSSQL. Here’s an example MSSQL configuration: [example_...
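The example configuration in this excerpt is truncated. As a hedged sketch of the shape it describes, a section-per-profile `.ini` file might look like the following; the section and field names are illustrative assumptions, not keys confirmed by this page, so consult the full Microsoft SQL Server article for the real ones:

```ini
; Hypothetical MSSQL source profile -- names are illustrative only.
[my_mssql_source]
type     = mssql
hostname = db.example.internal
port     = 1433
username = daft_reader
password = changeme
database = metrics
query    = SELECT name, value FROM telemetry
```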
MySQL Server
When integrating a MySQL data source using this handler, the configuration is defined in an .ini format. Each section within the configuration file represents a different data source profile. Below is an example configuration and a breakdown of each field: [e...
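The field-by-field example this excerpt refers to is cut off. A hedged sketch of one data source profile section, using illustrative key names rather than ones confirmed by this page, might be:

```ini
; Hypothetical MySQL source profile -- see the full MySQL Server
; page for the actual field names DaFT expects.
[my_mysql_source]
type     = mysql
hostname = db.example.internal
port     = 3306
username = daft_reader
password = changeme
database = metrics
query    = SELECT name, value FROM telemetry
```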
Prometheus Scraping
Postgres and Redshift
When integrating a PostgreSQL or Amazon Redshift data source using this handler, the configuration is defined in an .ini format. Each section within the configuration file represents a distinct data source profile. Below is an example configuration and a break...
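The example configuration this excerpt introduces is truncated. A hedged sketch of a profile section (field names are assumptions; note that Redshift clusters conventionally listen on 5439 rather than PostgreSQL's default 5432):

```ini
; Hypothetical PostgreSQL/Redshift source profile -- illustrative
; key names only; consult the full page for the real ones.
[my_postgres_source]
type     = postgres
hostname = db.example.internal
port     = 5432
username = daft_reader
password = changeme
database = metrics
query    = SELECT name, value FROM telemetry
```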
Splunk
This handler has not been tested; it was written against an API specification and published examples.
Prometheus / OpenMetrics
Monitoring and observability are crucial parts of any modern system. Two of the most widely adopted formats for exposing metrics are Prometheus and OpenMetrics. While they share similarities, they also have distinct characteristics and use cases. Below is a de...
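For reference, both formats expose plain-text metrics over HTTP, and a minimal Prometheus-style exposition looks like the sample below; OpenMetrics is a stricter superset of this format that, among other differences, requires an `# EOF` terminator line:

```
# HELP http_requests_total Total number of HTTP requests served.
# TYPE http_requests_total counter
http_requests_total{method="get",code="200"} 1027
```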
Telegraf
Telegraf is a powerful agent for collecting, processing, and writing metrics. Developed by InfluxData, Telegraf is part of the TICK stack and is designed for flexibility: it can collect data from a wide range of inputs and write it to a variety of outputs. Whe...
Configuring
To get started with configuring your data handlers, follow the steps below. This will ensure that your data sources are properly defined and accessible via the expected routes. Step 1: Copy the Example Configuration Begin by copying the example configuration...
Introduction
Access DaFT via your browser, scraper, or any HTTP client using the following pattern: http://<YOUR_HOST>/<identifier> The output format is determined by the endpoint you use: Default: JSON format (ideal for general use or integration with Telegraf). P...
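Following the `http://<YOUR_HOST>/<identifier>` pattern above, a request can be built and issued with any HTTP client; the host and identifier below are made-up example values, not ones defined by these docs:

```shell
# Build the request URL from the documented pattern.
# "localhost:8080" and "example_mysql" are illustrative placeholders.
HOST="localhost:8080"
IDENTIFIER="example_mysql"
URL="http://${HOST}/${IDENTIFIER}"
echo "$URL"
# Then fetch the default JSON output (requires a running DaFT instance):
# curl "$URL"
```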
Configuring Prometheus
To set up Prometheus to scrape metrics from your application or service, follow the steps below. 1. Edit prometheus.yml Open your prometheus.yml file and add the following section under scrape_configs: - job_name: "<identifier>" static_configs: - ta...
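The YAML in this excerpt is truncated. As a hedged reconstruction, a scrape job of the shape it describes would look roughly like this; the target address and `metrics_path` are assumptions based on the `http://<YOUR_HOST>/<identifier>` pattern elsewhere in these docs, not values confirmed by this page:

```yaml
scrape_configs:
  - job_name: "<identifier>"
    # Assumption: DaFT serves each source under /<identifier> rather
    # than the Prometheus default of /metrics.
    metrics_path: "/<identifier>"
    static_configs:
      - targets: ["<YOUR_HOST>"]
```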
Configuring Telegraf
To configure Telegraf to collect metrics from your application or service, follow the steps below. 1. Create a Telegraf Configuration File Create a new configuration file in the /etc/telegraf/telegraf.d/ directory. For example, you might name it DaFT.conf. ...
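The excerpt mentions a drop-in file such as `/etc/telegraf/telegraf.d/DaFT.conf` but cuts off before showing it. A hedged sketch using Telegraf's standard `inputs.http` plugin (the URL is an assumption based on the `http://<YOUR_HOST>/<identifier>` pattern, and `json` matches the default output format described in the Introduction):

```toml
# /etc/telegraf/telegraf.d/DaFT.conf
# Poll a DaFT endpoint over HTTP. The URL path is an assumption
# based on the http://<YOUR_HOST>/<identifier> pattern in the docs.
[[inputs.http]]
  urls = ["http://localhost/<identifier>"]
  data_format = "json"
```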