Feature List

Data Validation across areas

Automagical reconciliation of data between areas and alerting of any anomalies.

Rule execution moved to pub/sub

Refactored rule handoffs to decouple rules from direct handoffs, moving them to pub/sub instead.

Automated refresh of config for Lineage Maps

Viewing lineage maps automagically triggers a sync of config sheets.

Dependency Manifest

Dynamically determine dependencies across rules at time of rule execution to ensure immediate consistency of concept, detail and event data when refreshing Consume views.

Consume API

Ability to automagically push Consume views to a csv file to enable them to be consumed via an API.

Collection effective dated

Ability to add an effective date to Landing rules to define a date key for determining change data when loading History.

Google Secret Manager for Third Party Access

Leverage Google Secret Manager to provide a central place and single source of truth to manage, access, and audit secrets for third party app integration.
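
A minimal sketch of what runtime secret access can look like, assuming the google-cloud-secret-manager Python client; the project id and secret name below are hypothetical placeholders, not actual AgileData.io resources.

```python
# Sketch: fetch a third-party API key from Google Secret Manager at runtime
# instead of storing it in config. Project id and secret name are hypothetical.
from google.cloud import secretmanager

def get_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")

shopify_api_key = get_secret("my-tenant-project", "shopify-api-key")
```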

Levenshtein Distance Rule Pattern

Added Levenshtein rule pattern to allow Single Magic Record (Master Data Management) matching.
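
To illustrate the matching idea behind the pattern (not the product's actual implementation), a standard Levenshtein edit distance scores how far apart two candidate values are before they are treated as the same record:

```python
# Illustrative only: classic dynamic-programming Levenshtein edit distance.
def levenshtein(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(
                prev[j] + 1,               # deletion
                curr[j - 1] + 1,           # insertion
                prev[j - 1] + (ca != cb),  # substitution
            ))
        prev = curr
    return prev[-1]

# Two customer names that likely refer to the same single record.
print(levenshtein("Jon Smith", "John Smyth"))  # small distance -> candidate match
```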

DDL Conversion

Extract table DDL from Oracle and SQL Server systems of records and convert it to BigQuery DDL to automatically create the tables in AgileData.io.
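
A heavily simplified sketch of the kind of type mapping such a conversion involves; the mapping table and parsing below are illustrative assumptions, not the product's actual conversion rules.

```python
# Naive sketch: map a handful of Oracle / SQL Server column types to BigQuery types.
import re

TYPE_MAP = {
    "VARCHAR2": "STRING", "NVARCHAR": "STRING", "VARCHAR": "STRING", "CHAR": "STRING",
    "NUMBER": "NUMERIC", "DECIMAL": "NUMERIC", "INT": "INT64", "BIGINT": "INT64",
    "DATE": "DATE", "DATETIME": "DATETIME", "TIMESTAMP": "TIMESTAMP", "FLOAT": "FLOAT64",
}

def convert_column(source_col: str) -> str:
    """Convert e.g. 'CUSTOMER_NAME VARCHAR2(100)' to 'CUSTOMER_NAME STRING'."""
    name, source_type = source_col.strip().split(None, 1)
    base_type = re.match(r"[A-Za-z0-9_]+", source_type).group(0).upper()
    return f"{name} {TYPE_MAP.get(base_type, 'STRING')}"

print(convert_column("CUSTOMER_NAME VARCHAR2(100)"))  # CUSTOMER_NAME STRING
print(convert_column("ORDER_TOTAL NUMBER(10,2)"))     # ORDER_TOTAL NUMERIC
```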

Shopify Change Rules

Define default change rules for Shopify, including Customers, Products, Orders and Transactions.

Unnest Change Rule

Ability to define an unnest rule step in a change rule.

Slack Logging Notifications Channel

Integrate Slack as a notification channel to view runtime logs and errors.

Schedule based invocation of rules

Rules can be triggered/invoked based on a fixed schedule, for example at 2am each working day.

Shopify API Collection Rule

Change rule to allow you to automatically collect data from the Shopify API.

Rule and catalog API

API to allow rules and catalog entries to be called via the GUI or any other mechanism.

Rule execution code visibility

You can now see the code that will be executed for a rule.

Rule Natural Language Parsing

First version of natural language parsing of rules to:

  • identify key change rule words

  • define key words for rule pattern execution

Natural Language Rule Framework

Allows the parsing of a rule and storage of the rule pattern mapping in a dedicated high performance data repository to allow multiple concurrent requests from the GUI.

Concept and Detail Storage

Concepts and Detail catalog entries are now stored in a dedicated high performance data repository to allow multiple concurrent requests from the GUI.

Rule Storage

Rules are now stored in a dedicated high performance data repository to allow multiple concurrent requests from the GUI.

Docs.AgileData.io

Automated deployment framework for documentation and documentation site setup.

Rule Validation

Rule SQL is validated before the rule is submitted to pub/sub for execution.
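
One common way to validate BigQuery SQL without running it is a dry-run query; this sketch assumes the google-cloud-bigquery Python client and is illustrative rather than the product's actual validation step.

```python
# Sketch: validate rule SQL with a BigQuery dry run before publishing it for execution.
from google.cloud import bigquery

def validate_rule_sql(sql: str) -> bool:
    client = bigquery.Client()
    job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    try:
        client.query(sql, job_config=job_config)  # dry run: parsed and planned, never executed
        return True
    except Exception as err:
        print(f"Rule SQL failed validation: {err}")
        return False
```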

Automated User provisioning

Users are automatically provisioned when tenancy is created.

Soundex Rule Pattern

Added soundex rule pattern to allow Master Data Management matching.
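
For illustration, the classic American Soundex algorithm (not necessarily the exact variant used by the rule pattern) encodes similar-sounding names to the same code so they can be matched:

```python
# Illustrative only: classic American Soundex encoding.
def soundex(name: str) -> str:
    codes = {**dict.fromkeys("BFPV", "1"), **dict.fromkeys("CGJKQSXZ", "2"),
             **dict.fromkeys("DT", "3"), "L": "4", **dict.fromkeys("MN", "5"), "R": "6"}
    name = "".join(c for c in name.upper() if c.isalpha())
    if not name:
        return ""
    encoded, last = name[0], codes.get(name[0], "")
    for c in name[1:]:
        code = codes.get(c, "")
        if code and code != last:
            encoded += code
        if c not in "HW":            # H and W do not reset the previous code
            last = code
    return (encoded + "000")[:4]

print(soundex("Robert"), soundex("Rupert"))  # both R163 -> candidate match
```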

Automatic provisioning of security groups

When a tenancy is automatically provisioned, standard security groups are also automatically deployed.

Filedrop bucket naming

Improved security of the filedrop area by anonymising the filedrop bucket names to ensure they are not discoverable.

Separation of the filedrop area

The filedrop area is separated from other data areas to increase security in depth.

Consume rules

You can create a rule that uses a consume table as an input and outputs another consume table. This is useful when you want to create custom consume views with “pretty” field names (aka a semantic layer) for your visualisation or analytics tool to use.

Automatic creation of consume views in the consume area

Consume views are automatically created in the consume area when the consume tables are created in the event/consume layer.
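
For context, creating a view over a consume table in BigQuery is a small metadata operation; this sketch uses the google-cloud-bigquery Python client with hypothetical project, dataset and table names.

```python
# Sketch: create a consume-area view over a consume table (names are hypothetical).
from google.cloud import bigquery

client = bigquery.Client()
view = bigquery.Table("my-tenant-project.consume.customer_orders_view")
view.view_query = "SELECT * FROM `my-tenant-project.event_consume.customer_orders`"
client.create_table(view, exists_ok=True)  # creates the view if it does not already exist
```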

Separation of the consume area

The consume area is separated from other data areas to increase security in depth.

Data PII profiling

Profile data to identify any personally identifiable data that is being stored.

DDL files to create history tables

Allows files containing DDL to be dropped into filedrop to generate the table structure for the history tables.

Version Change Rules

When a change rule is updated and executed, the rule is versioned in config.

Data profiling stored in catalog

Results of the data profiling are stored against the object in the data catalog.

Data Lineage

Initial cut of tracing the lineage of data from filedrop all the way through history and event processing to the consume tables.

Updated Detail Fields

When you change the field list for a Detail change rule, the Detail table is dropped and rebuilt on next load to accommodate the field changes.

Rollback of loads

Option to reset load watermarks to force reload of data into history, events or consume.

Load statistics

Persist load statistics for data movement in filedrop, history, events and consume.

Rule execution state

Manage rule execution state to ensure two rules cannot simultaneously update the same concept, detail or event table.

Callbacks

Rules issue a callback to dependent rules to remove the risk of timeout issues when executing multiple dependent rules.

Filedrop based invocation of rules

When a file is dropped, the relevant rules that are dependent on that file will automatically execute.

Pub/Sub

Rules are executed via a publish and subscribe pattern rather than a data flow pattern, allowing rules to be authored and executed in isolation from other rules while also allowing them to be combined into an end-to-end data pipeline.
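
A minimal sketch of what triggering a rule through pub/sub can look like, assuming the google-cloud-pubsub Python client; the topic name and message payload are hypothetical.

```python
# Sketch: publish an "execute this rule" message; a subscriber service runs the rule.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-tenant-project", "rule-execution")  # hypothetical topic

message = {"rule_id": "rule_0042", "trigger": "filedrop"}                 # hypothetical payload
future = publisher.publish(topic_path, data=json.dumps(message).encode("utf-8"))
print(f"Published rule execution request: {future.result()}")            # prints the message id
```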

Autodetect preloaded csv file in filedrop

When a new csv file is dropped into the filedrop area, it is compared against the previously loaded file; if it is the same, the file is not reloaded.
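
One simple way to implement this kind of check is to compare a content hash of the new file with the hash recorded for the previous load; the hashing approach below is an illustrative assumption.

```python
# Sketch: skip reloading a csv whose content hash matches the previously loaded file.
import hashlib
from typing import Optional

def file_hash(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_reload(new_file: str, previous_hash: Optional[str]) -> bool:
    # Reload only when the content differs from the file loaded last time.
    return file_hash(new_file) != previous_hash
```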

Autodetect csv file metadata

For new files, the ingestion process automatically samples the csv file and determines the file structure. This creates a change rule to load the data into a history table. The metadata description and change rule are retained to ensure the same change rule is used on subsequent files.
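
Conceptually, the sampling step looks something like the sketch below: read a handful of rows, guess each column's type, and keep that schema for later loads. The type inference rules shown are simplified assumptions.

```python
# Sketch: sample a csv and infer a simple schema (simplified type inference).
import csv

def infer_type(values) -> str:
    def all_match(cast) -> bool:
        try:
            [cast(v) for v in values if v != ""]
            return True
        except ValueError:
            return False
    if all_match(int):
        return "INT64"
    if all_match(float):
        return "FLOAT64"
    return "STRING"

def infer_schema(path: str, sample_rows: int = 100) -> dict:
    with open(path, newline="") as fh:
        reader = csv.DictReader(fh)
        columns = reader.fieldnames or []
        rows = [row for _, row in zip(range(sample_rows), reader)]
    return {col: infer_type([row[col] for row in rows]) for col in columns}
```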

Auto Generate Consume Tables

After a change rule has executed, consume tables are created. Concepts and any related Details are denormalised into a single table. Events, the relevant Concepts that are part of the event definition, and the Details for those Concepts are denormalised into consume tables. All tables are as at the current point in time.

Rule Master Pattern - Output Concat Key

Allows the selection of multiple fields to use as the business key for a History table, Concept, Detail or Event.
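
The effect is equivalent to building a composite business key by concatenating the selected fields, along the lines of this simplified sketch (the delimiter is an illustrative assumption):

```python
# Sketch: build a composite business key from the selected fields.
def concat_key(record: dict, key_fields: list) -> str:
    return "|".join(str(record[field]) for field in key_fields)

order_line = {"order_id": "10042", "line_no": 3, "sku": "ABC-1"}
print(concat_key(order_line, ["order_id", "line_no"]))  # "10042|3"
```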

Rule Master Pattern - Output Key

Allows the selection of the field to use as the business key for a Concept, Detail or Event.

Calculation change rule pattern

Allow the use of a formula to calculate a value, for example a / b

Aggregated change rule pattern

Allow the use of a formula to calculate a value, for example sum (a)

Parse change rule pattern

Allow the use of a formula to parse a field, for example split(trim(a), ' ') : a1

Single table relationship change rule pattern

Join the fields in a single table.

Multiple table relationship change rule pattern

Join fields across two tables.

Field filter change rule pattern

Filter on a field for a given value.

New event change rule pattern

Config supports the creation of event records via a change rule.

New detail change rule pattern

Config supports the creation of detail records for a concept via a simple change rule.

New concept change rule pattern

Config supports the creation of concept records via a simple change rule.

Change data recognition and history point in time updates

When new data is ingested into the history tables, and similar data has previously been ingested into those tables, the ingestion process identifies new records versus updated records. For records that have changed rather than being new, the previous version of that record is end dated.
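
In essence this is slowly-changing style processing: unchanged records are ignored, new business keys are inserted, and changed records cause the previous version to be end dated. A simplified in-memory sketch of that decision logic (the real process works over BigQuery history tables):

```python
# Sketch: classify incoming records against history rows and end-date changed ones.
from datetime import datetime, timezone

def apply_changes(history_rows: list, incoming: list, key_field: str) -> None:
    """history_rows: dicts with business data plus 'start_date' / 'end_date' (None = current)."""
    now = datetime.now(timezone.utc)
    current = {r[key_field]: r for r in history_rows if r["end_date"] is None}
    for record in incoming:
        existing = current.get(record[key_field])
        if existing and all(existing[k] == v for k, v in record.items()):
            continue                           # unchanged record: nothing to do
        if existing:
            existing["end_date"] = now         # changed record: end-date the previous version
        history_rows.append({**record, "start_date": now, "end_date": None})
```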

File drop area

Users are able to drop a file into a filedrop bucket to enable it to be consumed into the history tables.
