Sunday, February 27, 2011

Google's Pimp Hand is Strong: The Dirty Little Secret Behind App Revenue

I wrote this about a year ago because I did not like the commoditization of content going on in the iTunes store. As game creators started to build for the store, I saw the train and I saw the wall. I just did not know how much track was in between. There seems to be a false sense of creator liberation: leaving the indentured servitude of the evil publishing overlords for a life of rainbows, unicorns and green fields of "direct" publishing and direct contact with the audience. However, as I pointed out over a year ago, the iTunes store is not a haven for creator profitability. It is really an aggregation of stuff large enough to create a compelling argument for the purchase of a high-margin piece of hardware. This picture is not so pretty, but it looks like the work of Ansel Adams relative to Google's impact. Through its AdMob division's domination of app advertising on both iOS and Android, Google has turned game developers from indentured servants into whores.

Today's game maker is indeed selling directly in a meaningful way for the first time since about 1994, but they are receiving only USD 0.70 to USD 7.00 per unit sold. App development runs anywhere from tens of thousands of dollars to over a million for things like Infinity Blade, with most of the top-selling apps falling in the USD 200,000 to USD 400,000 range - or about the same as the last time developers sold directly in a meaningful way. Some may argue the market is larger, but this pricing scenario certainly does not make for the robust market console and PC games enjoyed when budgets were at those levels and publishers were taking anywhere from USD 39.95 to USD 99.95 per unit sold. Our audience was indeed smaller, but the budget for the first Tomb Raider was within the range of today's apps and generated over USD 200 million - and it was not the only PlayStation game performing at that level. Many will be quick to point out the lack of COGS, but they must also acknowledge that all the customer acquisition costs remain, and are exacerbated by a much more crowded market. I am not the only one to recognize the shortcomings of the sale model. In fact, most developers see it and either skip sale altogether, or relegate it to a secondary revenue stream by making an ad-supported version of their apps available.
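To put the pricing gap in perspective, here is a rough break-even sketch (the budget and per-unit figures are illustrative, drawn from the ranges above, not from any particular title):

```python
# Rough break-even arithmetic for an app versus 1990s direct sales.
# All figures are illustrative, taken from the ranges discussed above.

def units_to_break_even(budget, revenue_per_unit):
    """Units that must be sold before the development budget is recouped."""
    return budget / revenue_per_unit

budget = 300_000  # a mid-range app budget in USD

# App store economics: the creator keeps roughly USD 0.70 on a 0.99 app
print(round(units_to_break_even(budget, 0.70)))   # ~428,571 units

# Mid-nineties direct sales at USD 39.95 per unit
print(round(units_to_break_even(budget, 39.95)))  # ~7,509 units
```

Same budget, two orders of magnitude more units required just to get back to zero - which is why so many developers give up on the sale model entirely.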

A survey of the top app developers reveals advertising as the major source of revenue. On Android, it is the only source of revenue. Google, through AdMob, dominates both iOS and Android ad sales. Android developers who lament Google's failure to provide an iTunes-like presence for aggregation and sale of apps fail to realize Google has no incentive to do so. While the developer who put time, tears and sweat into an app sees a game or great work of staggering genius, custom designed to unseat Angry Birds and bring joy to greater masses of humanity than any other product in the history of mankind, Google sees only filler. Like television executives, the content between and around the ads, and the people who make it, are a necessary impediment to business they begrudgingly deal with. Google has joined Apple in an even stronger effort to commoditize content.

In his recent book, "The War for Late Night: When Leno Went Early and Television Went Crazy," Bill Carter recounts a meeting Lorne Michaels, iconic producer of Saturday Night Live, had with the head of NBC when he was about to resign his role as producer. Irwin Segelstein, the executive in charge of creative, listened to Michaels lay out the idealistic reasons he had to resign before Segelstein explained:

When you leave, the show will get worse. But not all of a sudden - gradually. And it will take the audience a while to figure that out. Maybe two, maybe three years. And when it gets to be, you know, awful, and the audience has abandoned it, then we will cancel it. And the show will be gone, but we will still be here, because we're the network and we are eternal. If you read your contract closely, it says that the show is to be ninety minutes in length. It is to cost X. That's the budget. Nowhere in that do we ever say that it has to be good. And if you are so robotic and driven that you feel the pressure to push yourself in that way to make it good, don't come to us and say you've been treated unfairly, because you're trying so hard to make it good and we're getting in your way. Because at no point did we ask for it to be good. That you're neurotic is a bonus to us. Our job is to lie, cheat and steal - and your job is to do the show.

Google's mission is to sell ads. They do not care what the ads go into. If a sponsor does not like a certain type of app, no big deal - to Google - they move on to something else. They have hundreds of thousands of apps to pick from, and Google will always be there. The developer, on the other hand, put in sweat equity, incurred opportunity cost and maybe even spent their own money to get the thing up and running. Even a relatively big developer like Rovio, which built over 50 games before Angry Birds, does not have the portfolio or agility of Google. Mobile encourages a high click-through rate, commanding a relatively high CPM. The only thing Google needs is placeholders to prevent the ads from being served into black screens. They do not want your game, they want your audience. In fact, they want the opposite of your game. They want new stuff. They want your game while it is new, and then, like the network, they effectively cancel it and want a new game, thereby leading to an inevitable churn in the app market. For Google, churn is good. For an app developer, it is death. Because if Google, far and away the leading ad provider, decides it does not want to fill the inventory on your app, you may as well pack it up and go home. This raises the question: if Google is providing your main revenue stream, and the players receive your product for free, who is your customer and what are you selling them? I say your customer is Google and your product is eyeballs.
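The eyeball economics are plain CPM arithmetic. A minimal sketch (the player count, impressions per player and CPM here are hypothetical, and the fill rate is the lever the ad provider controls):

```python
def ad_revenue(impressions, cpm, fill_rate=1.0):
    """Ad revenue in USD: impressions offered, CPM (cost per mille, i.e.
    dollars per thousand served), and the fraction of inventory the ad
    provider chooses to fill."""
    return impressions * fill_rate / 1000 * cpm

# Hypothetical app: 50,000 daily players seeing 20 ads each, at a USD 1.50 CPM
print(ad_revenue(50_000 * 20, 1.50))                 # 1500.0 per day

# If the provider declines your inventory, fill rate goes to zero - and so do you
print(ad_revenue(50_000 * 20, 1.50, fill_rate=0.0))  # 0.0
```

Every term the developer controls multiplies against a term Google controls, which is the whole point: the audience is the product, and the buyer can walk away.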

App providers rushing into the ad-supported market who say their product is a game are no different from the dot-com guys who were going to get rich selling dollar bills for ninety-nine cents. If most of your games are being given away for free, they are not your product. You are monetizing the network of eyeballs created by the free distribution of the game. You sell those eyeballs to Google. Google will continue to buy those eyeballs if they are happy with your app. You may get millions of downloads of a free porn viewer for Android, but your customer, Google, will not be happy and will not pay for your eyeballs. In this regard, Google's pimp hand is strong. The choice for the app developer is to cater to Google, or hope they can sell enough apps to support their business. While the direct sales model may be stressed by pricing in the iTunes store, there is some light at the end of the tunnel in the form of digital objects and the growth of freemium games.

In this scenario, an app creator shifts the focus away from Google, and with it the designation of customer and product: from Google and eyeballs to gamers and the app itself. Unlike the box-it, ship-it, on-to-the-next days of console games, purveyors of these apps are building living, breathing products and responding directly to their customers' voracious appetite for content. Development costs are often as high or higher after launch as during initial development, but the payoff is a long and healthy revenue stream. Today this remains a much more "roast duck or no dinner" market than the finger-twitchy, ad-driven app games, but hopefully this market will grow enough to allow a migration back to the good old days of indentured servitude and commodity pricing of the iTunes store.

Sunday, February 20, 2011

The Stupidity of Crowds and The Wisdom of Aspergers: Death of Innovation in America

A few posts ago I promised to write about metrics-based design, the kind of stuff going on inside the social game companies who like to listen to the crowds over the designers. I was thinking and thinking, and then I realized that even though it just does not feel right to me, I am a "suit," not a game designer, and I am not going to figure it out. But it led me to think about a bigger issue. The power of the Internet and the growth of social networks bring an increased focus to the "Wisdom of Crowds." I've fallen for it too, writing about the hive mind and how quickly a group can come together to solve a given problem. It is easy to be seduced by the aggregated brilliance. I mean, after all, how can you question the resource responsible for contributions like Metacritic, Cheez Whiz and Abba?

The crowd wisdom concept did not start inside Zynga. It is attributed to Sir Francis Galton, who analyzed a contest at a county fair to guess the weight of an ox. While the guesses of livestock experts varied widely, and none were close, the average of the roughly 1,000 guesses came within a single pound. The same experiment has been repeated over and over with people guessing the number of jelly beans in a jar and any number of random objects in equally random containers. This should give us unwavering certainty in a crowd's ability to determine "average." Crowds are really good at averages and the lowest common denominator. Unfortunately, their power and scale make them even better at quashing new ideas and innovation. The Japanese say "the nail that sticks up gets pounded down." While crowds can be helpful in addressing issues, they suck when it comes to framing them, and we cannot confuse the power to determine mediocrity with the ability to innovate. Innovation happens on the edges of the bell curve. Crowds are the big bubble in the middle. You know, the objects of politicians' and television networks' pandering.
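Galton's result is the law of large numbers at work, and it says nothing about insight. A quick simulation (the true weight and the error spread are made up for illustration) shows a crowd of wildly wrong guessers still nailing the average:

```python
import random

random.seed(42)
true_weight = 1198  # pounds - an illustrative figure

# 1,000 individual guesses, each off by a large random error
guesses = [true_weight + random.gauss(0, 150) for _ in range(1000)]

crowd_estimate = sum(guesses) / len(guesses)
worst_error = max(abs(g - true_weight) for g in guesses)

print(round(crowd_estimate))  # lands within a few pounds of 1198
print(round(worst_error))     # while individual guesses miss by hundreds
```

The average is excellent even though no individual is. That power to find the middle of a known question is exactly what makes a crowd useless at the edges, where the question itself is new.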

Innovation must be strong to survive. We like to use the word "disruptive" a lot. The crowd is a vicious predator of innovation. Crowds ostracize our greatest innovators, the people whose work advanced humanity. When Francesco Redi started his experiments, science "knew" maggots spontaneously generated from decaying meat. Abiogenesis had been a known fact since the time of Aristotle. The proof was easy and obvious: set a piece of meat on a table and a few days later there were maggots. Redi did not believe it, so he put meat in covered jars as well as uncovered jars and proved them wrong. The heretic became the hero, or in Einstein's words, "to punish [him] for [his] contempt for authority, fate made [him] an authority." The same can be said for Copernicus, Galileo, Newton, Darwin, Einstein, Pasteur, Salk and Stiglitz. Pioneers of silly ideas like the Earth revolving around the sun or injecting disease from cows to cause immunity. They all made great contributions by not listening to the crowd, and today we remember their names and benefit from their conviction. You would think an educated world would spot this pattern and accept ideas from the edges, but it does not. Even though today's scientists are no longer subject to death or imprisonment, excommunication remains a fact of life.

I first heard about the phenomenon in a lecture by Nobel Laureate Kary Mullis. Well... it wasn't really a lecture, it was kind of a lunch on the beach, because Kary loved to surf and if people wanted to hear what he had to say, we had to go to the beach. A group of guys in suits sat at the UCSD-sponsored event on Windansea Beach in La Jolla. Kary told the story of a paper he wrote about AIDS. He started his research paper with the phrase "HIV causes AIDS." As a scientist he needed to find support for every statement, so he started to look for the study supporting the conclusion, and the only thing he could find was a CDC memo written by a non-scientist. It was a statement, not a conclusion, and it was completely unsupported. Worse yet, Kary's research disclosed cases of AIDS with no detectable HIV. When he started to ask about them, the AIDS community got very upset. Undeterred, Kary hypothesized that AIDS is made up of tens of thousands of viruses, only a few of which are visible with current technology. He saw an HIV correlation, but no causation. He theorized AIDS was caused by a pooling of viruses, visible and invisible, through sexual contact into a giant toxic cocktail. In other words, he engaged in the scientific method. He established a hypothesis and published it to be tested. As you can find on the Wiki page about Kary, he gets the "red flag" argument: "Medical and scientific consensus rejects such statements as disproven." The hypocrisy of the statement would be funny if it did not suck the oxygen out of every alternative theory. The support for these statements and others, according to the crowd-sourced authority, is "Confronting AIDS: Update 1988" from the Institute of Medicine of the U.S. National Academy of Sciences. But their argument is based on consensus, not proof:

New information about HIV infection and its epidemiology has emerged to either confirm or alter earlier impressions of the disease. One question that has been resolved is the causative agent of AIDS. HIV and AIDS have been so thoroughly linked in time, place, and population group as to eliminate doubt that the virus produces the disease. The committee believes that the evidence that HIV causes AIDS is scientifically conclusive.

Another report, from the National Institute of Allergy and Infectious Diseases, applied the Koch postulates to determine whether HIV caused AIDS. The postulates were developed in the 19th century as tests to prove a link between putative pathogenic agents and disease. But even in this report, cited as conclusive evidence, the support for the third postulate - transfer of the suspected pathogen to an uninfected host, man or animal, produces the disease in that host - is given as "the polymerase chain reaction (PCR) and other sophisticated molecular techniques have enabled researchers to document the presence of HIV genes in virtually all patients with AIDS, as well as in individuals in earlier stages of HIV disease." I am not going to address the irony of the test Kary invented being used to undermine his credibility, because the focus should be on the word "virtually." This means some AIDS patients do not have HIV. Why attack the person who questions why? The purported support is not based on science; it is demanding faith in consensus. No matter how many wrong people agree upon the same wrong thing, it remains wrong. While Kary's theory made logical sense in light of the knowledge, or lack of knowledge, available in the early nineties, it was too late. The research community latched on to the idea of HIV causing AIDS, and research dollars were only available to studies pursuing the HIV/AIDS link. In fact, the pull was so strong, cancer researchers were reframing their cancer studies as AIDS research into Kaposi's sarcoma and other cancers common to AIDS patients. As a result, AIDS research since the eighties has deviated little from the HIV connection. The very definition of AIDS has evolved from a syndrome to a disease defined by the presence of HIV.

I don't know whether HIV causes AIDS or whether AIDS is a syndrome or a disease. I do know that after the billions of dollars spent in a single direction, we have yet to find a cure or vaccine for a disease that has been killing people since the early eighties. Sure, we can point to prolonged lives for HIV-positive people, but if Kary and others are correct, they never would have developed AIDS in the first place. What would be wrong with allowing a couple of Galileos to have a look in a different direction? When no one has an answer, are the pursuit of alternative theories and the consensus theory mutually exclusive?

Through the control of dollars, consensus rules every area of science. I found this really neat website with all different ways to visualize complex data sets. The site displayed an image addressing the scientific consensus around human-caused global warming. The image powerfully displays the consensus counterargument against new ideas - in this case, climate change. It kind of gives you an idea of what it is like to be the guy with the alternative hypothesis: "All the cool kids say you're wrong." Reliance on consensus must be seen as a red flag in any argument. Sometimes we get battling armies of scientists from both sides, as if the volume of scientists in agreement makes a difference. Science is beautiful because no matter how many people believe something to be a certainty, it only takes one person to prove them wrong. It is true that 90 percent of published climate scientists surveyed believe in "human caused" global warming. But, as Richard Lindzen, one of 11 scientists responsible for the oft-cited 2001 National Academy of Sciences report on climate change, said:

But--and I cannot stress this enough--we are not in a position to confidently attribute past climate change to carbon dioxide or to forecast what the climate will be in the future. That is to say, contrary to media impressions, agreement with the three basic statements tells us almost nothing relevant to policy discussions.

Like Mullis, Lindzen, despite acknowledged brilliance, years of research and peer-reviewed support, often carries the word "naysayer" when his name appears in print. But he is right. The agreement is characterized as "human caused warming" because they do not agree carbon is the cause, and they do not even agree on the pace of warming. In the presence of such great uncertainty and cataclysmic peril, unsupported facts get mixed in with scientific data, and when spun around and repeated enough, they become "true." Just as in the case of the AIDS infrastructure rising from an unsupported fact, in its 2007 report the IPCC included a similarly unsupported statement that the glaciers in the Himalayas would melt entirely by 2035. The finding came from a popular British magazine, not from peer-reviewed literature. Even though it was unsupported, the statement provided a much-needed visual to a cause in need of public support and understanding. All of the focus on carbon created a financial, scientific and political industry around the management and reduction of carbon, which is suffocating any other analysis or effort in the area of climate studies. In reality, our climate system is one of the most complex systems we will encounter in our lifetime. Carbon may be the cause, or it may be something we cannot even see or comprehend.

The truth is, we just do not know, and the pursuit of a carbon-based solution may be pushing a real solution deeper into the future. More significantly, the research and propaganda are leading the public to believe that if we change enough light bulbs and drive enough Priuses, things will get better. Sadly, this is untrue.

If the scientists claiming carbon is the major contributor to warming are correct, we are too late. Reduction to zero will not impact the warming trend during our lifetime, or likely even our children's. If the carbon folks are right, the planet is getting warmer and reducing carbon will not help. If the carbon folks are wrong, the planet is still getting warmer, and this will not be the first time. We are only parasites on an uncaring Earth.

This picture is of the Edwards Air Force Base desert just outside Los Angeles. The base is built on "Lake" Rogers. The climate changed and this lake, like the once-fertile farmland a world away in the Sahara, disappeared. When the lake dried, the shrimp living in it burrowed into the ground and waited for rain. It only happens every 25 years or so, but when it rains, they come out. There were no climatologists around to talk about warming and no scientists theorizing on how to stop it. The shrimp adapted to the hotter, drier climate, and the people adapted. Why aren't we looking to do the same? There is no reason to stop the carbon research, and we should stay conscious of the footprint we make, but should the efforts toward living in a changed world be dwarfed so greatly by the hundreds of billions and perhaps trillions of dollars going into carbon reduction? How about some of those dollars going to living in a warmer world? Some more could go to relocating the folks who are losing their entire countries in the Pacific.

So who are the crazy ones? When the crowds are this vocal, it takes a special superpower to fight against them. The very people who are shunned by the crowd for being different are the ones who move us forward. Fortunately, people with these powers have been walking around for years. They were described by a guy named Hans Asperger:

"for success in science or art, a dash of autism is essential. The essential ingredient may be an ability to turn away from the everyday world, from the simply practical and to rethink a subject with originality so as to create in new untrodden ways with all abilities canalized into the one specialty." (Asperger 1979, p.49.)

Einstein agreed when he said “A foolish faith in authority is the worst enemy of truth.”

Asperger defined a syndrome characterized in part by an inability to understand peer pressure or be political - one half of the wonder twin power that makes people with Aspergers so valuable in the face of mob rule. The very attributes causing them to be shunned from the crowd are accompanied by the power to ignore the crowd. The other half is the ability to recognize patterns others do not and to focus on details while unable to see the big picture. When those powers meet in a person who is willing to spend long hours alone in a lab or in front of a computer, you get very different, and often great, results. It should be no surprise, then, that in his book Asperger's and Self-Esteem: Insight and Hope through Famous Role Models, Norm Ledgin claimed Marie Curie, Albert Einstein and Mozart all had Aspergers - his earlier book, Diagnosing Jefferson, described why Thomas Jefferson likely had it. Others have added Shakespeare, Jane Austen, Darwin, Galileo, Picasso, Benjamin Franklin, Margaret Mead, Aristotle and Bill Gates to the list of contributors aided by Aspergers, and according to Wired Magazine, a growing chunk of Silicon Valley.

It is irresponsible to believe everyone diagnosed with Aspergers or some other spectrum disorder is going to change the world. But social media is elevating metrics over innovation, connecting and amplifying the voice of the status quo, and by extension the march toward mediocrity. We should all aspire to bring a little Aspergers into our everyday thought. Sometimes the lowest common denominator is not the answer.

Thursday, February 17, 2011

Egyptian Uprising, The Game: The Gamification of a Revolution Edition

This morning we woke to news of President Obama heading to San Francisco to meet with these guys:

*John Doerr, Partner, Kleiner Perkins Caufield & Byers

*Carol Bartz, President and CEO, Yahoo!

*John Chambers, CEO and Chairman, Cisco Systems

*Dick Costolo, CEO, Twitter

*Larry Ellison, Co-Founder and CEO, Oracle

*Reed Hastings, CEO, Netflix

*John Hennessy, President, Stanford University

*Steve Jobs, Chairman and CEO, Apple

*Art Levinson, Chairman and former CEO, Genentech

*Eric Schmidt, Chairman and CEO, Google

*Steve Westly, Managing Partner and Founder, The Westly Group

*Mark Zuckerberg, Founder, President, and CEO, Facebook

They say he is going to talk about spurring innovation and the economy, but coming on the heels of the Tunisian and Egyptian uprisings, and with growing unrest throughout the Middle East, I would hope he is talking about something else. These are the leaders of the tools used to bring down the Egyptian government. On the one hand, let's hope he is trying to figure out what happened in the interest of protecting his own job. Tea Parties are one thing, but.... On the other hand, is the uprising proof of the superior efficacy of digital tools over boots on the ground? To the leaders of the world this is all new. To gamers, it is just Tuesday. While game makers like SCVNGR, Blippy and Foursquare "gamify" life and others theorize about it, the Egyptians gamified a revolution.

Jane McGonigal is promoting her book, Reality is Broken, by pointing out the value of 21 hours a week of gameplay. She argues gamers are developing skills that are useful in the real world, and all we have to do is build games to let them solve the problems. However, the gamers skipped the game. The uprising's use of the social web, and for that matter the uprising itself, is no different from what gamers know as alternate reality games, or "ARGs" - without the "A." Just off the top of my head, I can point to ARGs for Hellboy and Batman that organized and drove people to real-world protests based on a fictitious fact set and an imaginary cause. In case anyone ever wondered what would happen in these scenarios if the stakes were real, we saw it played out in Egypt. The leaders of the revolution used social tools to spread a message, gain credibility and encourage protest. If this were an MMO, we would say they leveled up, built a guild and went on a quest.

This Internet thing can be scary. Western governments all call for the growth of democracy, and the Internet has delivered exactly that, whether the rest of the world likes it or not. We say the leaders in a democratic administration serve at the will of the people, but we are only now providing the voice to show what this really means. Governments are trying to assert sovereign power based on borders in a borderless world. The Egyptian government learned it could not turn off the Internet. The Jordanian government learned it should listen to the voice of the people - hopefully it did so in time.

Don't get me wrong, job creation is important, but if I were in a room with those people tonight, I sure wouldn't be talking about jobs.

Thursday, February 3, 2011

Jeetil Patel Says Sell EA: Time to Buy ERTS Edition

I owned EA stock for a while. It happened to be the wrong while for profit, but a very good while for someone looking for a way to offset gains in a rapidly rising market. The stock has flatlined since I sold it, and I did not pay attention to it until I saw the "sell" rating articulated by Jeetil Patel of Deutsche Bank. Based on his past performance, this rating is a stronger buy indicator than the company's balance sheet could ever be. Forget the forecast he issued the day before the earnings release, which was way off, and excuse him for not knowing EA would announce a USD 600 million stock repurchase - no one could anticipate that. These are only embarrassing. Focus on his reasoning. It is bordering on - I would really like to use the "R" word, but it is so politically incorrect now, and I do not want to say criminal because it really is not - fraud to hold oneself out as an expert and provide an analysis as stupid as the one Barron's reported him to provide.

Patel’s concern with Electronic Arts is that their product line is simply too broad, in contrast to Activision, where single hit titles have taken the lead, such as Call of Duty: Black Ops, which swept industry sales over the holiday, according to the same NPD report cited above.

This made an awful lot of sense when I heard it from Steve Jobs, but he was talking about an inventory-carrying manufacturing company. Jobs explained the need to take Apple's product line down from tens of variations on the Mac to four simple products. Mr. Patel seems to forget, or never learned, that EA and Activision are in the entertainment business. Most people with a brain in their head see strength in numbers when it comes to franchises in an entertainment company. If Mr. Patel had written this last year, he probably would have admired the two-pronged attack of Guitar Hero and Call of Duty. Does his reasoning mean Activision is better off for the collapse of a billion-dollar market? If so, then Activision should really focus on burning out Call of Duty so it can put all of its effort into Warcraft. While they are at it, tell Blizzard to scale back on Starcraft and stop production on the new MMO.

The shame lies in Mr. Patel's failure to identify, or acknowledge, what is turning out to be the only publisher positioned to move forward into the next decade. Every other publisher that fell from the number one slot either disappeared or continued as a hollow shell of what it once was. EA looks like it may be turning itself around. There is a glimmer of hope in its core business, and unlike the other publishers, who are either in denial or chasing their own tails in social, iOS and freemium - the places where money is being made - EA is making money and leading the field.

A few clicks of light research would have revealed Dead Space 2, EA's strongest launch in years, to Mr. Patel. With Dead Space, EA shows that an internal team other than BioWare is capable of launching a title with review scores in the 90s. No small feat, considering only Take-Two and Ubisoft join EA on the list of publishers with multiple internally developed franchises scoring above 90, and each of those companies spent significantly more than EA to get there. Couple this with the Syfy feature and a top-selling iOS title, and it looks like the breadth of EA's production did not preclude its ability to properly market and support the franchise either. This glimmer of hope in the core business is interesting, but the rest is exciting.

Even though this post was read by hundreds of people from the EA domain, I can't take credit for the company's focus on digital distribution. Riccitiello has been talking about it for years now. But in a very unusual move for a publishing CEO, he is actually following through on his promise. The company showed over USD 200 million in revenue from digital distribution and is still on track to show USD 750 million for the year. Perhaps more significantly, EA dominates the App Store, and while it is a distant third to Zynga, it is the only console publisher in the top 15 developers of social games. Each of these channels delivers significantly higher-margin revenue than console and provides an opportunity for game makers to take risks on new franchises.

I am not saying I am ready to put in my purchase order, but I am certainly ready to let Mr. Patel and the other naysayers know the game publishers are not quite dead yet.