AT Internet has developed the Data Flow feature, a new API for extracting large amounts of data at a very fine level of granularity (hit level). It is an additional product that you can subscribe to at the site or contract level. For any questions about pricing or trials, please contact your dedicated salesperson or the AT Internet Support Center.
As we need to understand the behavior of our visitors in an ever-changing context, it is crucial to obtain detailed, precise data as quickly as possible. With this in mind, the Data Flow feature allows you to extract data hour by hour, as soon as the hour in question is completed and the data has been calculated (there is a delay of 3 to 5 minutes for real-time processing). This feature is dedicated to extracting potentially all actions taken by every visitor on a website. The exported data can then be imported into Data Warehouses or Data Lakes for data mining purposes. Unlike the Reporting API (Data Query), which is dedicated to extracting specific ranges of data, Data Flow should be used to get data at the finest granularity.
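To make the hourly availability rule concrete, here is a minimal sketch, assuming data for a given hour becomes extractable once that hour has ended and the processing delay has elapsed. The function name and the use of the 5-minute upper bound of the stated 3-to-5-minute delay are illustrative assumptions, not part of the Data Flow API itself.

```python
from datetime import datetime, timedelta

# Illustrative assumption: use the upper bound of the 3-5 minute
# real-time processing delay described above.
PROCESSING_DELAY = timedelta(minutes=5)

def latest_available_hour(now: datetime) -> datetime:
    """Return the start of the most recent hour whose data should be
    available for extraction at time `now`."""
    # Truncate to the start of the current (incomplete) hour.
    floor = now.replace(minute=0, second=0, microsecond=0)
    # The hour that just finished is only safe to fetch once the
    # processing delay has passed; otherwise fall back one more hour.
    if now - floor >= PROCESSING_DELAY:
        return floor - timedelta(hours=1)
    return floor - timedelta(hours=2)
```

For example, at 10:07 the 09:00-10:00 hour is available, but at 10:03 an extraction job should still target the 08:00-09:00 hour.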
Having the data in your own Data Warehouses or Data Lakes will make it easier for you to cross-reference it with other external data sources (e.g. CRM, email, etc.) and answer more complex business questions, such as running advanced sequential analyses or redefining your own attribution models as your business evolves.