Time is money. It’s an old adage, but it still applies to everything from banking and high finance to hiring the neighbor’s kid to mow your lawn: The more time it takes, the higher the cost.
This is true for the cloud as well. The longer data volumes remain on third-party public resources, the greater the cost to the enterprise. The question is, at what point does the cost of relying on the cloud exceed that of local infrastructure?
According to NetApp CTO Jay Kidd, not long. As he noted at a recent Wells Fargo gathering, archival storage at Google and Amazon runs about 2.5 cents per gigabyte per month, so within three months you are already pushing past the cost per gigabyte of an average disk drive. To be sure, the cost of local storage encompasses much more than just the drive, but for enterprises that have already borne the expense of infrastructure deployment, the economics of moving large amounts of data to the cloud start to break down for anything but the most transient of applications. For large block volumes, the numbers are even less affordable—as high as $4 per gigabyte over a typical three-year hardware life cycle.
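The arithmetic behind that three-month figure can be sketched in a few lines. The prices below are rough assumptions drawn from the ballpark numbers above (roughly 2.5 cents per gigabyte per month for archival cloud storage, and a commodity disk drive in the range of 7–8 cents per gigabyte), not vendor price lists:

```python
# Back-of-the-envelope break-even: recurring cloud archival cost
# vs. the one-time cost of a local disk drive (drive only, no
# infrastructure overhead). Figures are illustrative assumptions.
cloud_rate_per_gb_month = 0.025   # ~2.5 cents/GB/month, archival tier
disk_cost_per_gb = 0.075          # ~7.5 cents/GB for an average drive

# Months until cumulative cloud spend matches the drive's purchase price
break_even_months = disk_cost_per_gb / cloud_rate_per_gb_month
print(f"Break-even after roughly {break_even_months:.1f} months")
```

Note that this understates the case for the cloud in one direction (local storage also needs power, space, and administration) and overstates it in the other (a purchased drive keeps serving data for its whole life cycle, while cloud charges accrue indefinitely).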