I had to laugh out loud recently when I read about the legal case against Anthropic over using books to train its AI. Google started its book-scanning project, codenamed “Project Ocean,” back in 2002, aiming to digitize a vast number of books and create a massive digital library. In 2004, Google officially announced its Library Project, partnering with major research libraries like Harvard, Stanford, Oxford, the University of Michigan, and the New York Public Library to scan books from their collections.
The project grew quickly: within a few years, Google had scanned more than 10 million books. It faced legal challenges of its own, particularly concerning copyright, including a class-action lawsuit from the Authors Guild, which Google ultimately won. As of October 2019, Google announced it had scanned over 40 million titles, and the company’s long-term goal is to scan all 130 million distinct books estimated to exist in the world.
That’s the first reason nobody comes close to Google for sheer input data. They set out to “digitize the planet’s data” a long time ago, and they have data OpenAI can’t even imagine getting. Ever.
The second reason is efficiency. People love talking about how power-hungry AI is. Guess what? Google has been optimizing for decades now. It is not a one-off that OpenAI asked to use Google’s infrastructure. I remember maybe 20 years ago I ran into some friends of friends in Holland. Database experts, they had developed a little trick to make things slightly more efficient, and… Google had bought it instantly.
Unless you are some iPhone-toting schmuck who only uses ChatGPT, you know this already. Gemini is just so much faster. And they keep lowering the cost per query. GPT-4 is estimated to use around 0.42 Wh per query, while a long prompt can be much higher, sometimes exceeding 33 Wh. Google Gemini 2.0 Flash is estimated to be highly efficient, with energy use as low as 0.022 Wh per query. Claude 3.7 Sonnet ranks high in eco-efficiency among its peers at 0.81 Wh. (Take all these figures with massive pinches of salt, but you get the picture.) And then there is the energy cost of training, of course, where again Google is years ahead because, well, Google has been doing AI for years (auto-replies in Gmail, Google Photos, etc.).
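To put those per-query figures in perspective, here’s a back-of-the-envelope sketch using the estimates above. The volume of one billion queries per day is purely an illustrative assumption, not a measured figure:

```python
# Scale the (very rough) per-query energy estimates quoted above
# to a hypothetical volume of 1 billion queries per day.
ESTIMATES_WH = {
    "GPT-4 (typical prompt)": 0.42,   # Wh per query, estimate from the text
    "Gemini 2.0 Flash": 0.022,        # Wh per query, estimate from the text
    "Claude 3.7 Sonnet": 0.81,        # Wh per query, estimate from the text
}
QUERIES_PER_DAY = 1_000_000_000  # illustrative assumption

for model, wh_per_query in ESTIMATES_WH.items():
    # Wh -> MWh: divide by 1,000,000
    mwh_per_day = wh_per_query * QUERIES_PER_DAY / 1_000_000
    print(f"{model}: {mwh_per_day:,.0f} MWh/day")
```

Even with these shaky numbers, the spread is striking: at the same volume the most efficient estimate is roughly 20 to 40 times cheaper in energy than the others, which is the kind of gap decades of infrastructure optimization buys you.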
The third reason OpenAI can’t catch Google is our fav, the network effect. The minute they switched all Google searches to become AI searches, it was clear: they are not messing around. Not by a long stretch can any other company come close to the amount of data Google has on what people want to know. And they have been optimizing the answers they provide for decades. They were probably laughing at the recent user negativity toward GPT-5. Google has dealt with stuff like this since its inception, always experimenting with ways to present results.
So will OpenAI die? Of course not! Google knows how to play the anti-monopoly game. OpenAI will take its share and specialize in specific sectors. Google will leave the high-end corporate market to others too. Agentic AI too, maybe just keep a finger in the infrastructure and protocols. But let’s not kid ourselves. When you look at the big picture, Google is years ahead of everyone. OpenAI opened Pandora’s box sooner because that was the only play they had.






Everyone says “Top Gun” when you talk about plane movies, but by the end, this is more like “ET” meeting the “Iron Eagle” trilogy. Epic stuff. People smile at the right time, salute, explode and laugh just when they should. Well-made movie. For its kind. As long as you don’t try to relate anything you see to anything in the real world: geography, politics or technology. They even threw some romance in for good measure.