Warning signs show that automated bidding systems may turn into Skynet

July 13, 2015

Occasionally, when I read a news piece about some crazy new technology, such as Amazon’s idea to eventually use delivery drones, I think to myself, “Haven’t these people seen Terminator?” Denis Nekipelov and I are of the same mindset: he researched automated bidding algorithms and wondered whether they might become self-aware à la Skynet.

Phys.org interviewed Nekipelov, an Associate Professor of Economics and Computer Science at the University of Virginia, who did a study on how auto-bidding algorithms could eventually cause a financial collapse if allowed to run amok. Currently, automated bidding systems simply do what people can’t do in the same volume or at the same speed. When you do a Google search for “cat toys,” there is a brief brawl behind the scenes, and the ads that win end up being displayed to you. Automated bidding makes that happen.
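To make that behind-the-scenes brawl a little more concrete, here’s a minimal sketch in Python of the kind of second-price auction many ad exchanges run in the milliseconds before your results load. The advertiser names and bid amounts are invented for illustration:

```python
# A minimal sketch of the auction behind a "cat toys" search.
# Advertiser names and bids are invented; real exchanges are far more
# elaborate, but many use a second-price rule like this one.
def run_auction(bids):
    """The highest bidder wins the ad slot but pays the runner-up's bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

bids = {"PurrfectToys": 1.20, "KittyCo": 0.95, "MeowMart": 0.80}
winner, price = run_auction(bids)
print(f"{winner} wins and pays ${price:.2f}")  # PurrfectToys wins and pays $0.95
```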

Developers are also testing algorithms that can learn how these systems work and make bidding adjustments on the spot. For example, bidding might increase leading up to Black Friday or around other major sales holidays because the algorithm will have “learned” when to really dig in and compete. However, the problem is that smart algorithms aren’t actually, well, smart.
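Here’s a heavily simplified sketch of what that kind of “learning” might look like; the conversion numbers and the scaling rule are invented, but the idea is that the algorithm bids harder in weeks that have historically converted better:

```python
# A heavily simplified sketch of "learning" to bid harder before Black
# Friday. All numbers here are invented for illustration: the bid is
# scaled by how much better a given week has historically converted.
BASELINE_BID = 1.00        # dollars per click, hypothetical
AVERAGE_CONVERSION = 0.02  # hypothetical year-round conversion rate

# Hypothetical conversion rates observed in past years, by week number
historical_conversion = {46: 0.03, 47: 0.05, 48: 0.09}  # week 48 ~ Black Friday

def learned_bid(week):
    rate = historical_conversion.get(week, AVERAGE_CONVERSION)
    return BASELINE_BID * (rate / AVERAGE_CONVERSION)

print(learned_bid(10))  # 1.0 -- an ordinary week
print(learned_bid(48))  # 4.5 -- time to dig in and compete
```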

One example of this is when competitive pricing software on Amazon allowed a fairly innocuous book about genetics, The Making of a Fly, to be listed for sale at around $24 million.

Amazon’s million dollar book

Each seller’s pricing algorithm set its price as a fixed multiple of the other’s, so every time one of the two sellers of the book adjusted their price, the other followed, and an avalanche of dueling, exponentially growing price increases resulted.
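A short simulation shows how fast that avalanche moves. The two multipliers below are the ones widely reported in analyses of the incident; the starting price is a guess:

```python
# A simulation of the two sellers' dueling repricers. The multipliers
# match those widely reported at the time (one seller undercut at
# ~0.9983x, the other marked up at ~1.2706x); the starting price is a
# guess. Their product is greater than 1, so prices grow exponentially.
price_a = price_b = 18.00  # hypothetical starting price
rounds = 0
while price_b < 24_000_000:
    price_a = 0.9983 * price_b    # seller A slightly undercuts seller B
    price_b = 1.270589 * price_a  # seller B marks up over seller A
    rounds += 1
print(f"After {rounds} repricing rounds: ${price_b:,.2f}")
```

About sixty rounds of that loop carry a hypothetical $18 listing past $24 million.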

As a result of the inefficiency and inaccuracy of bidding systems, advertisers are able to pay for ad space at about 60% of its actual value. Good for advertisers, but the pendulum could theoretically swing the other way. This is especially true if more advanced learning algorithms are introduced and given more leeway and less oversight.

A recent example of excellent intentions with questionable outcomes is that Google’s ad system appears to be sexist based on what it has “learned” about users and human behaviors. Unlike that bias problem, there is a fairly simple solution to keep an ad algorithm from bankrupting advertisers and leading to:

… a Terminator-style scenario with machines limiting human intervention. A misstep in one algorithm could illogically inflate or deflate prices and skew the market.

That solution is simply to ensure that humans are in control and keeping an eye on things at all times. In the movies, people defeat super-advanced security systems with fingerprint and retina scanners all the time by cutting off a hand or gouging out an eyeball and holding it up to the scanner. That works because computers are dumb.

On the other hand, put a security guard in a control booth and they’re going to wonder why someone is sneaking around with severed fingers and eyeballs on a fork. The main issue is that people, especially computer people, get the idea that computers are so smart they can take over a workload, and so they set it and forget it. The problem is that computers can’t actually learn – at least not like a human – and aren’t actually smart. If they were, it wouldn’t have taken a human being to see that $24 million is a touch steep for a book.
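Here’s a minimal sketch of what that human-in-the-loop fix could look like in code, with invented thresholds: the algorithm proposes a price, but anything far outside a sane band waits for a person to eyeball it:

```python
# A minimal sketch of the human-in-the-loop fix, with invented
# thresholds: the pricer proposes, but anything outside a sane band is
# held for a person to review instead of going live automatically.
LIST_PRICE = 35.00  # hypothetical publisher list price

def needs_human_review(proposed):
    """Flag prices outside 50%-300% of list price for manual review."""
    return not (0.5 * LIST_PRICE <= proposed <= 3.0 * LIST_PRICE)

for proposed in (28.99, 41.50, 24_000_000.00):
    verdict = "hold for review" if needs_human_review(proposed) else "publish"
    print(f"${proposed:,.2f}: {verdict}")
```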

About the author

I'm an avid reader of stuff and devour information of all kinds. For the past four years, I've been pursuing my passion for writing. When I'm not reading or writing, you'll find me knitting. Follow me on Twitter: @MarilynMaupinTS
