Global creative effectiveness platform DAIVID launched a new AI-powered, self-service solution that predicts the emotions and attention levels an ad will generate, and its likely impact on brand and business metrics.
Trained on tens of millions of human responses to ads, DAIVID Self-Serve uses a predictive algorithm that tells advertisers within minutes the likely emotional and business impact of a video or image, without the need for audience panels. This enables advertisers to measure the effectiveness of their ad campaigns at scale.
To help brands, agencies and influencers optimise the effectiveness of their ads, users are also given a second-by-second breakdown of the likely emotions specific aspects of their content will generate among their target audiences.
Although eight out of 10 marketers believe creative quality is the most significant factor in an ad’s success, the sheer volume of branded content produced in today’s content-saturated, always-on world, across a range of platforms, formats and channels, means the vast majority goes out without any kind of pre-testing.
Because it does not depend on expensive and time-consuming panels to measure an ad’s likely emotional and brand impact, DAIVID Self-Serve analyses ads using AI to give marketers an instant snapshot of their efficacy, at any scale. It also offers insights into how advertisers can improve the performance of their campaigns. DAIVID’s AI models are built by combining facial coding, eye tracking and survey data with computer vision and computer listening APIs.
Companies that have been using the beta version of the solution include GroupM, Nike and influencer marketing agency Billion Dollar Boy.
Tom Saunter, Global Head of Creative Intelligence at GroupM, said: “We’ve long sought a scalable solution for predictive emotional scoring for GroupM clients, and by integrating the new DAIVID Self-Serve solution into our Creative Analytics product suite, we now have access to the widest range of emotional benchmarks on the market, along with valuable second-by-second breakdowns. In an analysis of over 3,000 videos, we’ve proven that DAIVID’s emotional data can be a strong predictor of actual media performance. This is a powerful unlock that will enable the next level of understanding in what drives creative effectiveness at scale.”
DAIVID’s CEO and founder, Ian Forrester, said: “With so much creative being produced these days, it’s easy to understand why brands and agencies feel a bit lost in the ad avalanche. They want to test and improve the effectiveness of their creative, but struggle to deal with the sheer amount being produced, leading to inconsistent results and a poor return on their media spend.
“Through our new DAIVID Self-Serve solution, we have cracked the creative code, helping our clients reduce uncertainty, unleash the emotional power of their creatives and supercharge their campaign effectiveness. In a nutshell, we’ve trained our AI so we can say whether advertising will work or not – and explain why. Our new platform offers advertisers a simple, fast and cost-effective way to measure, benchmark and optimise the effectiveness of all their content, helping them to make better decisions faster.”
To use the platform, users simply upload their content via an API. The predictive AI then analyses the content and lets them know the likely impact of their creative. DAIVID Self-Serve capabilities include:
- A prediction of the percentage of people likely to feel a deep emotional connection to the content, based on 39 distinct emotions;
- A second-by-second breakdown of the emotions an ad will generate among target audiences;
- Predicted percentage of people likely to recall the correct brand, share the content, recommend the brand and buy the product, and the key moments driving each;
- A breakdown of the attention captured by ads during the critical first and last seconds;
- An overall effectiveness score (out of 10) based on predicted attention levels, intensity of positive emotions and brand recall.
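The article says the overall score out of 10 blends predicted attention, intensity of positive emotions and brand recall, but does not disclose how they are combined. A minimal sketch of one way such a composite could be computed, with purely illustrative weights (these are assumptions, not DAIVID’s actual model):

```python
def effectiveness_score(attention: float,
                        positive_emotion: float,
                        brand_recall: float,
                        weights: tuple = (0.4, 0.35, 0.25)) -> float:
    """Blend three predicted metrics (each on a 0.0-1.0 scale) into a
    score out of 10.

    The weights here are hypothetical placeholders chosen to sum to 1.0;
    DAIVID's real scoring formula is not public.
    """
    for metric in (attention, positive_emotion, brand_recall):
        if not 0.0 <= metric <= 1.0:
            raise ValueError("each metric must be in the range [0, 1]")

    w_att, w_emo, w_rec = weights
    blended = w_att * attention + w_emo * positive_emotion + w_rec * brand_recall
    return round(blended * 10, 1)  # scale to a 0-10 score


# Example: strong attention, moderate emotion, weaker recall
print(effectiveness_score(0.8, 0.5, 0.3))
```

Any real integration would instead consume the score directly from the platform’s API output rather than recomputing it client-side.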
DAIVID’s predictive algorithm, which is continually updated to predict how people will respond to the visual and audio stimuli within the ad, has been developed by the company’s team of data scientists, led by Kantar’s former Chief Scientist Mitch Eggers. The data used to train the algorithm has been compiled over the last six years using consumer surveys, facial coding and eye tracking to measure audience panels’ responses to hundreds of ad campaigns.