CANNES — In a world where AI will generate huge quantities of suspect, synthetic content, the risk to advertisers of faulty ad adjacency would seem only to grow.
But some vendors are using AI to fight AI, and nowhere more so than in brand safety.
In this video interview with Beet.TV, Craig Ziegler, SVP of Product at Integral Ad Science (IAS), explains what his company is doing.
AI Helps Understand Content Nuances
Brands want to achieve their target performance goals and expand their reach, while simultaneously protecting their brand equity.
“That’s the opportunity,” Ziegler says. Artificial intelligence (AI) already plays a significant role in many of IAS’s products, particularly its Total Media Quality (TMQ) solution.
TMQ was developed to address the specific challenges of social video platforms, where the content is all about “sight, sound, motion” and includes video, image, and audio elements.
“Our AI helps marketers understand the adjacencies of that content and can understand it at a very nuanced level, which is really important because every brand has different requirements, different needs when it comes to suitability,” Ziegler explains.
Staying Ahead of Emerging Formats and Platforms
With new media formats and platforms constantly emerging, brands need to continually adapt their brand safety strategies. IAS recommends that brands regularly review and update their suitability goals and guidelines.
“That’s really a big part of our responsibility, a big part of our product roadmap,” says Ziegler, referring to applying their tools to different forms of content as they evolve.
“You have to be able to understand context at a pretty nuanced level so that you can meet brands’ needs as they define what their strategies look like,” he adds.
You’re watching “Global Leadership Summit: Data, Identity & Measurement, a Beet.TV Leadership Series at Cannes Lions 2024 presented by Digital Turbine, IAS, Intent IQ & TransUnion”.