These findings tell us which ads are working, the impact of creative and which messages resonate. But how often do we analyze data to identify what's going wrong in our operations? Aggregating your view and click data makes it easier to visualize potential anomalies in your trafficking – including places where data is missing entirely.
For example, consider a common trafficking mistake. Say you append Google Analytics (GA) parameters to your ad click-throughs, something like "?utm_source=offer1". In GA's own reporting, it can be difficult to tell whether a smaller ad source actually included those parameters in its click-through link. But if you analyze your click data by URL, the shorter, untagged URL stands out – revealing a click count that isn't being attributed to that ad source.
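A minimal sketch of that URL-level check, using hypothetical click-log data and the standard library; the URLs and log format here are illustrative assumptions, not IMM's actual pipeline:

```python
from urllib.parse import urlparse, parse_qs
from collections import Counter

# Hypothetical click log: the landing URL recorded for each click.
clicks = [
    "https://example.com/landing?utm_source=offer1&utm_medium=display",
    "https://example.com/landing?utm_source=offer1&utm_medium=display",
    "https://example.com/landing",  # a source that dropped the GA parameters
    "https://example.com/landing?utm_source=offer2",
]

# Count clicks per exact URL; untagged URLs surface as their own bucket.
by_url = Counter(clicks)

# Flag any URL whose query string is missing utm_source.
untagged = {
    url: n
    for url, n in by_url.items()
    if "utm_source" not in parse_qs(urlparse(url).query)
}
print(untagged)  # {'https://example.com/landing': 1}
```

Grouping by exact URL before checking parameters is what makes the shorter, untagged link visible as its own line item rather than vanishing into an "(unset)" bucket downstream.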
Along similar lines, large campaigns often switch destinations mid-flight across many publication sources, and careful analysis can pinpoint the ad sources whose links were never updated. Looking for anomalies in traffic data also minimizes financial risk: it can surface potential click fraud and incorrect view or landing data, either of which can break conversion-pixel link-up and result in missed conversions.
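The "which sources never got updated" check can be as simple as diffing each source's live destination against the campaign's current landing page. The trafficking sheet below is a hypothetical example, not a real campaign:

```python
# Hypothetical trafficking data: ad source -> destination URL currently live.
live_destinations = {
    "publisher_a": "https://example.com/spring-sale",
    "publisher_b": "https://example.com/spring-sale",
    "publisher_c": "https://example.com/winter-sale",  # missed in the swap
}

# The destination every source should point at after the mid-flight change.
expected = "https://example.com/spring-sale"

# Any source still pointing elsewhere was likely skipped during the update.
stale = [src for src, dest in live_destinations.items() if dest != expected]
print(stale)  # ['publisher_c']
```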
At IMM, we monitor large volumes of data in-house and use the methods above to ensure a consistent flow, starting with initial data collection. From there we pull hourly data for near real-time reporting. We set thresholds for the amount of information to expect, so when a data stream dips below its threshold, our data-ingestion team is prompted to investigate. It's much easier to find and fix problems upfront than to run into them at the end of the process in campaign reports.
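A threshold check of this kind can be sketched in a few lines. The hourly counts and the threshold value below are assumptions for illustration; in practice the expected volume would come from historical baselines per stream:

```python
# Hypothetical hourly impression counts for one data stream.
hourly_counts = {
    "2024-05-01T09": 10500,
    "2024-05-01T10": 11200,
    "2024-05-01T11": 430,    # the feed dropped out this hour
    "2024-05-01T12": 10900,
}

# Assumed minimum expected volume per hour for this stream.
THRESHOLD = 5000

# Any hour below threshold prompts the ingestion team to investigate.
alerts = [hour for hour, n in sorted(hourly_counts.items()) if n < THRESHOLD]
for hour in alerts:
    print(f"ALERT: {hour} volume {hourly_counts[hour]} below {THRESHOLD}")
```

Because the check runs hourly against the raw stream, a dropped feed shows up the same day – not weeks later as an unexplained gap in a campaign report.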
Whether reporting on behalf of clients or for internal purposes, it's important to recognize that data is a two-way street – it not only measures positive performance, but also exposes the vulnerabilities you need to solve for.