This article appears in the Summer 2017 issue of strategy.
A shaggy dog in a business suit picks himself up off a rain-soaked Japanese pedestrian mall to a soundtrack of mournful howls, and ingests a Clorets Mint Tab. Green pixels explode from his mouth. The sound shifts to hearty barks as he takes out a stopwatch, runs his paws through his hair, and blasts off into the sky, leaving his business attire behind.
That’s what creatives at McCann Japan came up with based on direction from their “AI-CD,” an artificial intelligence creative director. The agency pitted the machine against a real, flesh-and-blood CD last year to see who could produce a better Clorets ad conveying instant fresh breath that lasts for 10 minutes.
The AI-CD is a database of award-winning Japanese advertising from the past 10 years. The agency’s creatives deconstructed and analyzed the ads, tagging the elements that made them successful. For Clorets, the AI-CD wanted to “convey ‘wild’ with a song in an urban tone, leaving an image of refreshment with a feeling of liberation,” Ad Age reported. Humans took over from there and made the ad with the mutt.
The competing spot showed a woman on a rooftop writing the brand benefits in calligraphy. It narrowly beat the AI-CD in a blind online consumer poll (54% to 46%).
[iframe_youtube video="wMQ1AHB2XhQ"]
Despite what some consider a successful use of creative AI, the day when robots make CDs obsolete is probably not imminent. But brands and agencies are taking note of how the relationship with technology is changing roles and creativity more broadly. AI is already being used to target ads and match brands with influencers. How long until it plays a bigger role in the creative process?
Scott Suthren, VP of planning at Cheil Canada, says that is still a solely human domain at his agency. But machines are delivering ads, which affects the creative product. Teams must think in terms of segments, creating variations and no longer relying as much on “a simple, single ad,” he says.
When ads target consumers in a specific place and time, that information can also shape the message itself. Kellogg’s Vector cereal recently worked with Google and Starcom to make six-second YouTube ads targeting urban Canadians based on their location and weather. A base video of the cereal box could display messages for 20 cities and five different weather patterns, said Natasha Millar, a senior marketing director at Kellogg, in an email.
Served through Google AdWords, the script retrieved the latest weather for each targeted city every hour to ensure it delivered the correct information. Each version pushed Vector’s fitness-positive message: “Chase the clouds Toronto!” or “Edmonton, don’t let the rain set your tempo.”
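The mechanics of that kind of dynamic creative can be sketched as a simple lookup keyed on city and weather. This is a hypothetical illustration, not Kellogg’s or Google’s actual setup: the weather categories and fallback line are invented, and only the two quoted taglines come from the campaign itself.

```python
# Hypothetical sketch of weather-keyed copy selection. In the real campaign,
# a script refreshed each city's weather hourly and AdWords served the
# matching video variant; here we just model the variant lookup.

TAGLINES = {
    ("Toronto", "cloudy"): "Chase the clouds Toronto!",
    ("Edmonton", "rain"): "Edmonton, don't let the rain set your tempo.",
}

def pick_tagline(city: str, weather: str) -> str:
    """Return the copy variant for a city/weather pair, with a generic fallback."""
    return TAGLINES.get((city, weather), f"Start strong, {city}.")
```

A fallback line matters in practice: with 20 cities and five weather patterns there are 100 possible combinations, and any pair without bespoke copy still needs something to serve.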
It was the company’s first test with the dynamic ads, and Millar said it’s “just starting to understand how creative and targeting should influence each other in a programmatic world.”
McCann Japan’s project shows that experiments in AI-generated creative are already underway. Shun Matsuzaka, who led the AI-CD project, said in a statement that the robot was “free of bias or habits in the way of thinking that can limit human creators. This allows it to come up with creative direction that humans would never think of.”
One of the arguments creatives bring against big data and machine-generated creative is that it won’t come up with something entirely new. An AI-CD that analyzes a decade of award-winning ads won’t produce anything vastly different from the previous decade’s award-winning ads, the argument goes. It’s synthesizing, not creating. The question is how different this is from the human process, and how helpful it can be anyway.
Suthren says creative agencies can use machines as stimulus tools or brute-force generators of unconventional ideas. “The AI systems do not have any assumptions going in,” he says. “So anything they put out is novel and new and different, and could be creating connections.”
That’s why Russell Davies, chief strategy officer at BETC London, created a tagline-generating Twitter bot. Taglines are “a strange literary form,” he wrote in Campaign. “A narrow range of grammatical possibilities, a pretty limited vocabulary and they don’t really have to make sense.”
His bot, @taglin3r, spits out a new one every hour, uninhibited by human restrictions like logic and syntax. “Only Inspire Your Favourite Delight,” it tweeted the morning this article was written. Not the next “Just Do It,” but it probably beats some enhanced methods for burrowing into a creative brief, like the “vitamin” injection Don Draper and co. lined up for in that Mad Men episode.
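Because taglines follow such a narrow grammatical template, a crude version of this idea fits in a few lines. The word lists and slot template below are invented for illustration; the real bot’s vocabulary and method aren’t described in the article.

```python
import random

# Toy slot-filling tagline generator in the spirit of @taglin3r:
# pick one entry per slot and concatenate, with no regard for sense.

OPENERS = ["Only", "Always", "Simply", "Never"]
VERBS = ["Inspire", "Discover", "Unleash", "Taste"]
MODIFIERS = ["Your Favourite", "Pure", "Bold", "Everyday"]
NOUNS = ["Delight", "Moment", "Freedom", "Tomorrow"]

def make_tagline(rng: random.Random) -> str:
    """Fill each slot at random -- grammar and meaning strictly optional."""
    return " ".join(rng.choice(words)
                    for words in (OPENERS, VERBS, MODIFIERS, NOUNS))

print(make_tagline(random.Random(7)))
```

The point Davies makes holds even for a toy like this: because nothing constrains the combinations, most output is nonsense, but the occasional accidental pairing can jolt a stuck creative team.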
Speaking at Toronto’s CMDC Conference in April, MIT Media Lab assistant professor Kevin Slavin raised a concern about relying on machines, pointing to programs film studios use as shortcuts for evaluating scripts by running them against established story arcs. “We as a species are novelty seekers. We sell it, we buy it, we demand it, we require it. And we won’t get there by repeating what we’ve done,” he said.
The opportunity lies in humans working with machines to improve creativity and overcome blockages we fail to recognize, he said. “The greatest use of all this may not be in recognizing patterns that we can’t see but rather breaking the patterns that we can’t see.”
Mirum president Mitch Joel says the best use for machines could be to receive creative briefs outlining what brands want to achieve, spit out taglines for humans to sift through, and then test those choices. “Sentiment, imagery, brand history: think about all the inputs that a computer can capitalize on against a database,” he says.
This type of capability is already surfacing in design. Richard Thomas, co-founder of Toronto-based design and technology firm The Foresight Studio, says programs that allow users to input a design problem and receive “potentially millions of solutions” are changing the game.
One of these is Toronto-based Autodesk’s Dreamcatcher. The software lets designers enter a project’s functional requirements, manufacturing method and cost limits. It synthesizes the information and runs it against “a vast number of generated designs” (for a car or a chair, for example) before presenting solutions, its website says. Logojoy, also in Toronto, is offering something more basic for branding: users choose five logos from a few dozen, add colour and icon preferences, and presto – the algorithm presents a number of options. Nutella, working with Ogilvy & Mather Italia, used an algorithm to create seven million versions of its graphic identity, which were sold on unique jars.
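The scale behind a project like Nutella’s comes from simple combinatorics: a handful of independent design parameters multiply into millions of distinct outputs. The parameter names and counts below are invented, chosen only so the product lands on the campaign’s seven million; the actual algorithm isn’t public.

```python
# Minimal sketch of the combinatorics behind generative identity systems:
# each independent parameter multiplies the number of possible designs.

def count_variants(choices_per_parameter: dict) -> int:
    """Multiply the option counts across all independent parameters."""
    total = 1
    for n in choices_per_parameter.values():
        total *= n
    return total

# Invented example parameters for a label-variation system:
palette = {
    "background": 100,  # colour fields
    "pattern": 70,      # geometric motifs
    "accent": 50,       # accent hues
    "texture": 20,      # surface finishes
}

count_variants(palette)  # 100 * 70 * 50 * 20 = 7,000,000
```

This is also why Dreamcatcher-style tools can claim “a vast number of generated designs” from a short list of inputs: the search space explodes long before the parameter list does.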
Exposure to the volume of possible solutions from programs like Dreamcatcher is “going to change the psychology of designers,” Thomas says. “Right now we do that by reading books and absorbing as much as we can,” but even the most erudite AD can’t compete with the software’s millions of data points.
He says machines will “fundamentally change” the designer’s role, turning designers into teachers of their tools who feed algorithms the right information. Much of the time now spent on rote work will be automated, freeing designers, artists and engineers to focus on bigger questions like goals and societal impact.
That may be enough to lift the world-weary dog’s spirits if he ever runs out of Clorets.