Google just published its Q3 financial results. You can read them for yourself at http://investor.google.com/earnings/2010/Q3_google_earnings.html
So, what is Google spending on IT, and how many servers would that buy? This is one of their best-kept secrets. I looked at this earlier; let's take a fresh look. Some quotes:
Other cost of revenues, which is comprised primarily of data center operational expenses, amortization of intangible assets, content acquisition costs as well as credit card processing charges, increased to $747 million, or 10% of revenues, in the third quarter of 2010
In the third quarter of 2010, capital expenditures were $757 million, the majority of which was related to IT infrastructure investments, including data centers, servers, and networking equipment.
So let us modestly assume that half the capital and half the operational expense is server related: roughly $400 million each. Let us further assume a cheap Google server costs $1,000, and the associated network and datacenter facilities another $1,000, for $2,000 of capital per server. The running cost of the datacenter (power, cooling, etc.) could match that. This leads to an investment pattern of 200,000 servers per quarter, or 800,000 per year. With an average lifetime of 3 years, this puts the ballpark estimate of the size of Google's server farm at 2.4 million servers. There are entire countries that do not have that many servers. There are entire countries that do not have that many PCs.
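The arithmetic above can be sketched in a few lines. Every input here is one of the assumptions stated in the text (half of capex being server related, $1,000 per server plus $1,000 of supporting infrastructure, a 3-year lifetime), not a figure Google reports:

```python
# Back-of-envelope estimate of Google's server fleet.
# All inputs are assumptions from the text, not reported figures.
capex_per_quarter = 400e6       # assumed server-related capex, $ per quarter
cost_per_server = 1000          # assumed cost of a cheap Google server, $
cost_infra_per_server = 1000    # assumed network/datacenter share per server, $
lifetime_years = 3              # assumed average server lifetime

servers_per_quarter = capex_per_quarter / (cost_per_server + cost_infra_per_server)
servers_per_year = servers_per_quarter * 4
fleet_size = servers_per_year * lifetime_years

print(f"{servers_per_quarter:,.0f} servers per quarter")
print(f"{servers_per_year:,.0f} servers per year")
print(f"{fleet_size:,.0f} servers in the fleet")
```

Change any one assumption and the estimate scales linearly with it, which is why this can only be a ballpark figure.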
Since 2004, the server farm increased in size by a factor of 16, while revenue increased 10-fold (see also my 2005 estimates). That is 16/10 = 1.6, or about 60% more servers per dollar of revenue. Once more, Google increases the amount of compute power that goes into a dollar of revenue, Moore's law notwithstanding.