What Happens When Display Advertising Gets Even Smarter

Advertising Using Quality Scores

It’s not that Dealer.com Display Advertising was lacking intelligence. It’s just that now Display has grown even smarter.

Let us explain.

In early 2017, we launched nGauge, a Dealer.com Analytics product that examines various shopper engagement behaviors on a Dealer.com Website and assigns a quality score to each website visit. The higher the visit's score, the more likely that visit is to result in a vehicle purchase. This gives dealers the data they need to make more informed marketing investments: targeting shoppers who are more likely to buy a vehicle and adjusting marketing goals to align with shopping intent.
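As a rough illustration of the idea (not the actual nGauge model), a visit quality score can be sketched as a weighted sum of engagement behaviors. The behavior names and weights below are invented for the example:

```python
# Hypothetical sketch of an nGauge-style visit quality score: weight a few
# engagement behaviors and sum them, capped at 1.0. These behaviors and
# weights are illustrative, not the actual nGauge model.

ENGAGEMENT_WEIGHTS = {
    "viewed_vehicle_details": 0.3,
    "used_payment_calculator": 0.4,
    "submitted_lead_form": 0.6,
}

def visit_quality_score(behaviors: set) -> float:
    """Higher scores indicate a visit more likely to end in a purchase."""
    return min(1.0, sum(ENGAGEMENT_WEIGHTS.get(b, 0.0) for b in behaviors))
```

A visit with no meaningful engagement scores zero, while a visit that submits a lead form and runs the payment calculator hits the cap.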

So far, dealers’ response to nGauge has been very positive. In fact, many dealerships have been asking if this technology is available in any of our other products.

*Cue lightbulb turning on.

In continuing our pursuit to deliver the highest quality (not quantity) shoppers to our clients, we asked ourselves: What would happen if we used nGauge quality score technology to inform the machine learning algorithm that powers Dealer.com Display Advertising?

How Our Display Advertising Targeting Had Worked

To make sense of what we discovered, let’s review the inner workings of Dealer.com Display:

Comprising network, remarketing, and audience targeting, Dealer.com Display uses proprietary real-time bidding (RTB) powered by a form of artificial intelligence called machine learning. Simply put, this means teaching computers to learn much as humans do: analyze data, store it, and learn from successes and failures.

In the advertising world, this translates to consuming information about an impression, examining its outcome, and recording either a success or a failure. If deemed successful, the algorithm treats the impression (and its unique characteristics) as a positive signal, which is then used to drive performance.
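The click-as-positive-signal idea can be sketched in a few lines. The `Impression` structure and its fields are hypothetical, not Dealer.com's actual schema:

```python
from dataclasses import dataclass

@dataclass
class Impression:
    placement: str   # where the ad ran
    geography: str   # where the shopper was
    clicked: bool    # the observed outcome

def label(impression: Impression) -> int:
    """A click is treated as a positive signal (1); no click is negative (0)."""
    return 1 if impression.clicked else 0

# Each served impression becomes a (features, label) training example.
training_set = [
    (imp, label(imp))
    for imp in [
        Impression("homepage", "VT", clicked=True),
        Impression("news-site", "NY", clicked=False),
    ]
]
```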

Dealer.com Display, which purchases impressions on behalf of our dealer partners, integrates with numerous advertising exchanges, the technology platforms that facilitate the “handshake” between advertising buyers and sellers. These ad exchanges are suppliers of inventory: publishers (websites) generate revenue by allowing ads sold through those exchanges to appear on their properties. It’s an enormous digital advertising ecosystem, communicating, transacting, and serving ads in 800 milliseconds, millions of times a day.

Dealer.com’s real-time bidding receives hundreds of thousands of display advertising requests every second from ad exchanges. The system must evaluate each request to decide whether to bid in the auction, and at what price. Dealer.com machine learning makes that decision for each live display campaign in our system. Using a proprietary algorithm, each display campaign operates within its own model, drawing on data to assess variables such as dealership characteristics, geography, ad placement, and information about the end user, to name a few.
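A minimal sketch of that bid decision, with a tiny invented scoring function standing in for the proprietary per-campaign model (all signal names, weights, floors, and dollar values are illustrative):

```python
from typing import Optional

def predict_click_probability(request: dict) -> float:
    # Stand-in for a per-campaign machine learning model: a simple linear
    # score over whichever signals appear in the bid request.
    weights = {"placement:homepage": 0.02, "geo:local": 0.03}
    base_rate = 0.005
    return base_rate + sum(weights.get(s, 0.0) for s in request["signals"])

def decide_bid(request: dict, max_cpm: float = 4.00) -> Optional[float]:
    """Return a CPM bid price, or None to sit out the auction."""
    p_click = predict_click_probability(request)
    value_per_click = 0.10                    # illustrative dollar value of a click
    bid = p_click * 1000 * value_per_click    # expected value per 1,000 impressions
    if bid < 1.00:                            # illustrative bid floor
        return None
    return min(bid, max_cpm)
```

A request carrying strong signals earns a bid (capped at the campaign's maximum CPM); a request with weak signals is skipped entirely, since most requests aren't worth buying.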

Although each campaign, and each automotive advertising agency, has its own unique model, the primary goal of the algorithm has been to optimize for clicks back to a website, treating any impression that resulted in a click as a positive learning experience, regardless of what happened after the click.

How Our Display Advertising Works Now

In spring 2017, using nGauge’s quality score framework, we experimented with a change to our machine learning algorithm so that it used only quality clicks (clicks from shoppers more likely to purchase) as positive training examples. We hypothesized that this would train display ad campaigns on the traits of shoppers most likely to buy a car, justifying a slightly higher price for impressions likely to lead to a quality visit.
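The change described above amounts to filtering the training set. Assuming a hypothetical `visit_quality` field and threshold, it might look like:

```python
# Hypothetical sketch: only clicks whose resulting website visit scored above
# a quality threshold remain positive training examples. The threshold and
# record fields are illustrative.

QUALITY_THRESHOLD = 0.7  # assumed cutoff on an nGauge-style visit score

def positive_examples(click_records):
    """Previously every click was a positive example; now only quality clicks are."""
    return [r for r in click_records if r["visit_quality"] >= QUALITY_THRESHOLD]

clicks = [
    {"impression_id": 1, "visit_quality": 0.9},  # engaged shopper
    {"impression_id": 2, "visit_quality": 0.1},  # bounced immediately
]
quality_clicks = positive_examples(clicks)  # keeps only impression 1
```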

To prove or disprove this hypothesis, we took a handful of campaigns (the experimental group) and had them “train” on the new model. We then compared their results with those of a different set of campaigns that had not changed (the control group).
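An experiment like this boils down to comparing the two groups' mean quality scores and checking that the difference is statistically significant. The sketch below uses made-up scores and Welch's t statistic:

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

# Made-up average visit-quality scores per campaign, for illustration only.
experimental = [0.74, 0.71, 0.78, 0.70, 0.76]  # trained on quality clicks
control      = [0.68, 0.66, 0.70, 0.67, 0.69]  # unchanged model

lift = (mean(experimental) - mean(control)) / mean(control)
t = welch_t(experimental, control)
```

A t statistic well above 2 (relative to the degrees of freedom) would indicate the lift is unlikely to be chance; in practice you would convert it to a p-value before declaring significance.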

The Results

This test showed that experimental campaigns achieved statistically significantly higher quality scores than control campaigns.

Overall, we concluded that training our machine learning algorithms on quality clicks resulted in at least a three percent increase in average campaign quality score. The rate at which display advertising campaign clicks resulted in an actual website visit (rather than an immediate bounce after clicking the ad) also increased by 13 percent.* That might not seem like a big lift, but applied across all of our display advertising partners, it translates into a significant improvement in ROI and advertising efficiency.

As a result of this experiment, the Dealer.com Advertising engineering teams will officially update our display advertising algorithm for all campaigns in Q3 2017.

Analytics are growing ever more sophisticated. As nGauge demonstrates, the industry should no longer consider them an accessory, but instead make them the backbone of every digital marketing effort. Our dealers were first privy to this data in the form of quality website visits. As of this summer, they’ll be able to take advantage of a more effective advertising strategy with Dealer.com Display, which is now using quality traffic signals as a key variable when deciding how to purchase ad impressions.

Display just got a whole lot smarter.

If you have any questions or want further detail, please feel free to reach out or comment below.

Brent Towne is the senior manager, product management – analytics, and Scott Blodgett is the product manager – advertising at Dealer.com.

*A few factors are at play here, including better avoidance of bidding on bot traffic and accidental clicks.