Click Fraud Survey & Discussion

Wednesday, December 14, 2005

Pay-Per-Click Model Doomed? Another Click Fraud Threat

After reading the posts below from Yahoo! Finance's GOOG message board, it appears that click fraudsters have found yet another reason to engage in the lowly practice--boosting the stock price of the PPC businesses--i.e. Yahoo! and Google. How many clicks does it take to raise GOOG's share price $1? Click fraudsters want to know. Read it all.

"It is as legal as you go to the department stores just browsing and not buying. I bought GOOG at IPO and sold all my shares recently when it hit $425. My reason is that it is about time for the advertisers to find out that the ads are not as effective as they think they are. Sure Google has some mechanicism to filter computer generated clicks and refund some money, but it has no way to outsmart human beings. When I were one of the proud GOOG share holders, I frequently posted here to advise other share holders how to "effectively" click and help increase GOOG revenue. The key point is to use different key words search every time. You always want to use popular key words to generate the maximum revenue for Google, but use the combination only once a day. As for me, I have three computers under my desk. I only use the main one for serious purpose. The other two are connected to my home network, mostly doing computation work. Occasionally I google and click on ad links using these two computers when I feel like it. For those companies, they just think I am one of the consumers come and not buy."

Shocking, sickening and infuriating.

"$500 MILLION from shareholder CLICKS every quarter:
the SEC doesn't care:everyday CLICK at least 50 times, its easy and you help a worthy cause: your checkbook
50 click-throughs per day x 30 days x 3 months x $1 a click = $ 4500 to GOOG bottom line !!!!
all Google shareholders are doing it - JOIN IN
there are 100,000 shareholders: $4500 each per quarter means we have been contributing $500 mILLION to google revenues...NOT BAD"


If you were an advertiser wouldn't you be worried about this? Maybe Google is helping your business, but that does not make fraud tolerable. An easy fix would be to do away with pay-per-click ads, but Google & Yahoo have no incentive to do that. In essence, these companies have a business model allowing them and their owners (shareholders) to print money. And this will persist for as long as advertisers can afford it. So, is Google a pyramid scheme, using the pyramid model as a financial springboard from which to jump into more legitimate businesses? You decide. I already have, and something stinks.

Part 2: Google Sends a Non-Boilerplate Response, HOLY COW!

This is the third (and in my opinion, the most telling) of three very interesting articles about Google's Adwords written by Robert X. Cringely, one of Silicon Valley's brightest minds over the past 20 years. See below for the two other parts.


My friend finally finished his AdWords experiment after 36 days. He was trying to do a Taguchi optimization of his AdWords campaign, something that has so far eluded Taguchi experts including the PhD (my original contact) who was hired to help him. The Taguchi Methods of Robust Design are techniques for testing hundreds or even thousands of variables with only a handful of actual experiments. Taguchi considers the process as a black box and looks only at inputs and outputs. In many ways it is a kind of codified reverse-engineering system, but one that is very well proven since its original invention in 1945.
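For readers unfamiliar with the method, here is a minimal sketch of the Taguchi idea in Python, using the standard L4 orthogonal array and made-up responses. Nothing here reflects the experimenter's actual design; it only shows how a handful of runs can estimate the effect of each input on a black box.

```python
# A minimal sketch of the Taguchi idea: the L4 orthogonal array tests
# three two-level factors in only four runs instead of the 2**3 = 8
# runs a full factorial would need.  The factors and responses below
# are hypothetical -- the experimenter's real design is not public.

L4 = [  # each row is one experiment; each column one factor (0 = low, 1 = high)
    [0, 0, 0],
    [0, 1, 1],
    [1, 0, 1],
    [1, 1, 0],
]

responses = [12.0, 15.0, 14.0, 9.0]  # made-up outputs (say, clicks per dollar)

def main_effect(factor):
    """Average response at the factor's high level minus its low level."""
    high = [r for row, r in zip(L4, responses) if row[factor] == 1]
    low = [r for row, r in zip(L4, responses) if row[factor] == 0]
    return sum(high) / len(high) - sum(low) / len(low)

effects = [main_effect(f) for f in range(3)]
print(effects)  # the factor with the largest effect dominates the black box
```

Because the array's columns are balanced against one another, each factor's effect can be read off independently; that balance is what stops working when something inside the black box keeps changing, which is exactly what the experimenters reported.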

Alas, Taguchi doesn't work well with AdWords for some reason. Just when things are starting to make sense, they stop doing so. That's the way it was for my friend. Things were going well until suddenly they changed for a five-day period ending with the publication of my first AdWords column. The AdWords black box could be optimized for a while, but then it couldn't be. And then it could be again.

This was annoying for the Taguchi experts, who were quite used to optimizing black boxes with thousands of internal variables that are never identified or seen. What was going on here? What was introducing what Taguchi calls "noise factors" that kept the optimization from being achieved?

The Taguchi experts concluded that the noise factor was probably some form of Bayes-Nash equilibrium experiment being conducted by AdWords itself. The Taguchi expectation of rational behavior was being confounded, they believed, by deliberate manipulation intended to move the Nash Equilibrium point, improving mid- to long-term profit for Google. This is possible because, unlike a traditional Nash Equilibrium experiment where all bids are known, the AdWords algorithm is unknown, though (incorrectly) presumed to be rational. In other words, AdWords was deliberately giving up income in the short term (that's considered irrational behavior) to coax AdWords advertisers to bid higher for words, thus leading to greater revenue and profit for Google in the long term.
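The equilibrium argument is easier to see with a toy model. In a sealed second-price auction (a simplified stand-in for keyword auctions, not Google's actual algorithm), the winner pays the runner-up's bid, so the auctioneer gains more by coaxing losing bidders upward than by touching the winner's bid. A hypothetical sketch:

```python
def second_price(bids):
    """Winner pays the runner-up's bid (a toy model of a keyword auction)."""
    ranked = sorted(bids, reverse=True)
    return ranked[0], ranked[1]  # (winning bid, price the winner pays)

# Hypothetical bids for one keyword, in dollars per click.
_, price_before = second_price([1.00, 0.40, 0.10])
_, price_after = second_price([1.00, 0.80, 0.10])

# If opaque feedback coaxes the runner-up up from $0.40 to $0.80, the
# auctioneer's revenue doubles even though the winning bid never changed.
print(price_before, price_after)
```

This is the mechanism the experimenters suspected: short-term revenue sacrificed to move every bidder's equilibrium bid upward over time.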

Those are the conclusions of the experimenters, not me. So I ran their conclusions by Jeff Huber at Google, who is in charge of engineering for AdWords. I asked a simple question: Is that what you were doing?

"In short, no," said Jeff. "We are not intentionally introducing 'noise factors' or any other perturbations in the style the proposed theory suggests to affect near-term or long-term revenues.

"I know it's not quite as exciting, but our model is pretty simple. We want our users to have the most relevant possible content, so we work really hard on an ongoing basis to optimize end-user perceived quality. We want our advertisers to have a great return on the money they spend with us. If we do both of these well, we'll continue to do fine financially -- more users will come to Google because we provide the most relevant content and the best experience, and more advertisers will come because we provide qualified and cost-effective introductions for their business and they make a lot of money. We may well give up revenue in the near term to improve end-user perceived quality; it may not appear 'rational' (to an advertiser, a given modeling method, or a Wall Street analyst, for that matter!) to optimize on relevance rather than near-term revenue, but we think it's the right long-term thing to do for users, advertisers, and us.

"I'm not an expert on Taguchi methods, but I have seen the approach very successfully applied to systems that have relatively linear behavior between inputs and outputs (for example, many websites now use the approach to optimize their home-page landing pages based on different test layouts; obviously it has a rich history in manufacturing).

"As mentioned in my prior note, the AdWords system is highly dynamic and incorporates user behavior, optimization on end-user relevance (which is continually evolving and improving), competitor behavior, historical performance of advertising campaigns, and non-linear effects based on position, campaign configurations (e.g., rate limiting to manage to advertiser-defined budgets), and policy enforcement (e.g., editorial review on ad creative changes, algorithms to encourage diversity and minimize redundant ads). I think I understand the experiment's spirit and intent, but the implementation could also have run into our double-serving policy, where multiple ads are attempting to target the same terms and serving the same or highly similar (re-named) content.

"How and whether AdWords could be modeled with Taguchi methods, or other approaches, might be an interesting PhD research thesis topic. Without knowing the accounts involved and how the experiment was conducted, it's unfortunately pretty hard to conclude which factors may have been at play in the five days that your friend had trouble modeling. I'll reiterate our offer to help investigate if your friend would like; all we'd need is the account(s) used in the experiment."

Who's right? Who's wrong? Does any of this really matter? And is the concept of "evil," which Google claims to avoid, even remotely involved? Beats me.

It isn't really clear what's going on here, though the experimental side was organized and professional enough that I would tend to believe they were observing SOMETHING, whatever was causing it. Even if Google was trying to optimize AdWords in that way, there is nothing illegal in it. If they don't do it, someone else will.

Notice how in Jeff's explanation, above, he said the AdWords system adjusts for maximum profitability for Google and "maximum user perceived quality for the advertisers." Not the maximum return on investment for their AdWords budget. The key word is PERCEIVED.

Finally, several months ago Google (and Overture, too) were quite specifically advertising positions for experts in Bernoulli-Nash Equilibria optimization. These were jobs not for programmers but for Operations Research types. One of the people recruited has been a friend of mine for many years. From what he told me, Google and Overture were looking for EXACTLY the kind of technical capability described above.

Is this a big deal? Not for most people. Not even for most AdWords users, who are probably making plenty of money from their campaigns; otherwise -- as we now know from Nash -- they wouldn't be doing it at all. But for AdWords users like my friend the experimenter, it probably means that truly optimizing an AdWords campaign is going to be a lot harder. I think it can be done, but it won't be easy.

Survey Update

For this analysis to work, I need numbers, folks.

And tell others to come and post their stories.

Thank you.

Google Sends a Non-Boilerplate Response, HOLY COW!

To Commenter #4: The article below may explain your problem with decreased ROI.

What follows is an excerpt combined from three articles written by Robert X. Cringely, one of Silicon Valley's brightest minds over the past 20 years. (The articles are from 9/22/05, 10/6/05, and 10/13/05).

9/22/05 (halfway down):
One of my readers makes his living selling goods over the Internet, and his sole means of obtaining customers is through Google AdWords. His business is robust for a one-man operation and he makes a good living. Knowing the actual numbers, I would say he makes a VERY good living, which shows the effectiveness of Google and AdWords as an advertising medium.

But one can never make enough money, it seems, so this reader decided to do some research to see if he could improve his results by modifying this and that. He decided that the best way to conduct this research was not by altering variables on his existing, very profitable web site, but by creating a separate site purely to be used for these tests.

Clearly, this is a behavior that the big brains of Google did not expect.

It was no big deal to create a separate experimental site. Web hosting companies offer e-commerce sites for only a few dollars per month. A Google AdWords account costs only $5.00 to set up. The actual content of the new web site could simply be copied over from the pre-existing site and changed at will as dictated by the experiment.

Most people would alter variables on the main site and see what happens, but this guy didn't want to mess with the success he was already achieving, so he came up with this parallel experimental design.

The first thing he wanted to study was the impact of paying more or less for AdWords. He knew that paying more would result in higher placement -- especially given that both the ads and the AdWords would be identical between the two sites. What he didn't know, however, was whether a slight increase in cost-per-word would more than pay for itself in increased sales, or whether a slight decrease would go effectively unnoticed, thereby increasing his profit margins.

His old site with the same ads had been running successfully for a year, paying at the relatively low rate of $0.10 per word (the AdWords minimum is $0.05 per word) and generating about 15,000 click-throughs per day. But for the new site, he started out paying $1.00 per word for exactly the same words. Based on everything he had read about AdWords (remember, nobody actually SPEAKS to Google about these things -- the service is totally automated from Google's end), he expected his ad to move higher in the rankings and, hopefully, to make more sales as a result. And that's exactly what happened, though not to the extent that he would have liked.

Buying AdWords at $1.00 versus $0.10, his ads DID move higher on the page and his revenue increased, though not by enough to justify going all the way to $1.00 with its associated higher cost basis.
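The break-even arithmetic behind "not enough to justify" can be sketched with hypothetical numbers; the reader's real click counts, conversion rate, and margins are not public:

```python
def incremental_profit(clicks, cpc, conv_rate, margin):
    """Daily ad profit: conversion margin earned minus click spend."""
    return clicks * conv_rate * margin - clicks * cpc

# Hypothetical figures for the original $0.10 site vs. the $1.00 test site.
# A higher bid buys more clicks, but each click costs ten times as much.
profit_low = incremental_profit(clicks=15000, cpc=0.10, conv_rate=0.02, margin=20.0)
profit_high = incremental_profit(clicks=18000, cpc=1.00, conv_rate=0.02, margin=20.0)
print(profit_low, profit_high)  # revenue rises at $1.00, but profit collapses
```

With numbers like these, the extra sales from higher placement would have to be enormous to cover a tenfold cost-per-click increase, which matches the reader's conclusion.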

All the while, of course, the essentially identical original web site was churning along, still entirely dependent on AdWords, still carrying identical ads for identical products as the test site, and still generating an average of 15,000 click-throughs per day. Now it was time to drop the per-word price a bit on the test site to see whether he could increase his profit margins after paying too much at $1.00. So he set the new per-word price at $0.40 -- still four times as much as he was paying per word through his main site. And his clicks-through dropped from 15,000+ to 1,200 per day. Huh?

Same products, same ads, same service, but by paying four times MORE per word than his main site, his results dropped by an order of magnitude.

A bit more experimenting showed a similar effect, and he was never again able to match the success of his original site, which continued to operate in precisely the same market with precisely the same services over the exact same period of time.

I have no idea what the heck is happening here, but my friendly reader, who makes his living from this stuff, has a theory. He believes the Google AdWords algorithm tries to do many things and one of those is to encourage advertisers to pay more for words. By modifying something that in turn modifies the results, Google is effectively encouraging advertisers to change their behavior.

So increasing the amount per word DID increase sales, though not enough to justify the additional cost. Google's revenue per word, of course, went up by 10X. But dropping the price by more than half was greeted by a huge decrease in clicks-through that could only have resulted from some unknown resultant change in GOOGLE's behavior, given that all other variables were constant.

If that's indeed what's happening, it isn't illegal, and to some it might not even be unethical (I guess), but it feels just a little bit EVIL. Ironically, the only way this could be observed was through the use of parallel, otherwise identical web sites and AdWords accounts.
"It's like Vegas," said my friend. "They want you to lose. Try to game the system and they cut off one of your legs."

Now back to Google AdWords, which I wrote about two weeks ago (it's in the links) and promised an update today. To review, a friend of mine conducted some AdWords tests in an effort to fine-tune his yield as an already successful advertiser. He set up a parallel account using the same ads that had worked well before and found that raising his bid-per-word helped sales to a certain extent, but then reducing that bid led to a precipitous decline BELOW the level of clicks-through that he was getting all along with his original site at a significantly lower bid. What gives? I asked. Was Google punishing him for lowering his bid in this controlled experiment?

Readers weighed in, generally cautioning that the experiment wasn't at all controlled because the new site was new. If it and the original site were exactly the same age, then it might indeed control for time effects, but as presented, the experiment was probably just showing that the AdWords algorithm somehow gives preference to sites with longevity.

Other readers took exception to the idea that a single experiment could show any reliable results. Give them 100 or more experiments, they said.

For his part, Google VP of Engineering Jeff Huber offered to help:
If you or your friend would be interested, we'd be happy to take a closer look at the behavior experienced and help understand what might be going on. I can vouch that we do take our company motto of "don't be evil" very seriously, and we work very hard to ensure that our advertisers get great return-on-investment for the money they spend with AdWords.
So, on further thought and without specific data on the account(s) involved, here are some potential non-evil explanations of what might be happening:

1. ad position does have non-linear behavior -- higher position ads tend to get a multiple of clicks of lower position ads,

2. auction dynamics can change if any additional entrants/competitors arrived during the period (it is very dynamic -- think of a marketplace that does many multiples of the daily transaction volume of the NYSE or NASDAQ),

3. end-user perceived quality, as measured by ad click-through-rates, substantially affects ad ranking (more than bid price, in fact; we hope & believe that our emphasis on end-user quality & relevance is fairly non-evil :-)

4. in some areas there can also be a "brand effect" to end-user perceived quality, where users will choose brands they recognize & trust more; if your friend's established business has established brand effect in his/her segment, it could be getting higher click through rates, and therefore getting higher/better placement and more traffic, with lower keyword bids,

5. we also strive to provide diversity of high quality ads for better end-user quality; if the ads in question were pure duplicates (e.g., in destination URL) attempting to show on the same end-user queries, they may get a lot less traffic than the original account/campaign (which has an established history & track record of high quality)
And there may be others. With more data from the accounts in question, we should be able to tell a lot more. If you'd like to connect me with your friend, we'll take it from there -- all we'll need is account ids/e-mail addresses from the experiment.
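Point 3 above is the crux: at the time, AdWords was widely reported to rank ads roughly by bid multiplied by click-through rate, so a cheap ad with a strong CTR can outrank an expensive one. A hedged sketch with hypothetical figures (the real formula has many more inputs, per Huber's list) shows how an established ad bidding $0.10 could beat a new ad bidding $0.40:

```python
def rank_ads(ads):
    """Rank by bid * CTR -- the widely reported core of the 2005 AdWords formula."""
    return sorted(ads, key=lambda ad: ad["bid"] * ad["ctr"], reverse=True)

# Hypothetical advertisers competing on one keyword.
ads = [
    {"name": "original site", "bid": 0.10, "ctr": 0.050},  # trusted, high CTR
    {"name": "test site", "bid": 0.40, "ctr": 0.008},      # new, unproven
]
order = rank_ads(ads)
print([ad["name"] for ad in order])  # the cheaper, higher-CTR ad wins
```

Under this model, a new site with no click history starts with a low estimated CTR, which would explain why quadrupling the bid on the test site still produced far fewer click-throughs than the established site at $0.10.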

My friend preferred not to have Google help him with his experiment, which by the way was one of 17 experiments based on well-established research methodology. What was so peculiar about his results was that the market in question has very little competition so both of his ads (and those of his single real competitor) were nearly always on the first page no matter what the bid price. Odder still was the fact that shortly after my AdWords column appeared, THE EFFECT SUDDENLY STOPPED. Go figure.

Monday, December 12, 2005


Google and Yahoo, the two biggest pay-per-click business providers, do not provide users with enough information when it comes to click fraud. Their inaction has led to my action, and together we can fight back.

Once upon a time, I had a Google Adwords account... and since this is a blog about click fraud, you know what transpired. I started getting lots of extra clicks but no corresponding increase in business. Google didn't help, and since watching for, detecting, and claiming click fraud are full-time jobs in themselves, I threw in the towel and shut down. Sad story, yes, but I'm doing something about it.

I am conducting a survey on click fraud to figure out what's going on and to make sense of the click fraud world, which is only murky because of the way Google and Yahoo choose to side-step the issue. I'll post the results here FOR EVERYONE TO SEE. It should be interesting to see what people have to say. And as far as I know you will be participating in the first ever click fraud study! You're a pioneer!

So, all Google, Yahoo, or other pay-per click advertisers, would you please answer the following questions and either email them to me at or post here? Thank you very much, and I look forward to seeing what we uncover.

1. What service do you/did you use? (Google (Adwords or Adsense), Yahoo, something else)

2. Are you still using the service?

3. Approximately how many clicks per day do you/did you receive?

4. Type of goods sold?

5. Approximately how many keywords do you/did you pay for?

6. What is/was your average cost per click?

7. Did you ever complain to your service provider about click fraud? Were they helpful? Did you get a refund?

Thank you and look for the results soon!