It's hard enough to build a solid primary storage strategy for a complex server environment, but that's nothing compared to the effort required to get your proposal through your organization's budgeting process. A refreshed storage architecture or a new backup infrastructure requires loads of capital, which generally means you have to get creative when you articulate requirements to management.
Yes, some sort of song and dance is almost always required. On the other hand, part of the problem is that we seldom do a good job educating management about how our systems live and breathe and what they actually cost to run. Obfuscation isn't in anyone's interest in the long run -- especially as the cloud matures and becomes a viable alternative to some segments of in-house infrastructure.
One reason this disconnect occurs is that management fails to grasp the relationship between the cost and criticality of a given application and its actual technical requirements -- storage or otherwise. Time and time again I have seen huge, massively expensive, mission-critical applications sit on equally huge and massively expensive storage infrastructures that are essentially idling under low load. Meanwhile, elsewhere in the data center, a collection of cheap tertiary applications are burning their storage resources to a crisp with loads many times those of the critical systems.
The reason this happens can usually be traced back to the budget process. When management is told that a mission-critical application will cost millions of dollars to acquire and implement, there's rarely much in the way of pushback over a beefy infrastructure to match. In fact, it's expected.
That's how the budgetary two-step was invented. IT departments everywhere use big ticket projects to justify the purchase of infrastructure those projects don't need -- in order to feed the needs of little projects that nobody would spend enough capital on otherwise.
Is that really such a bad thing, you ask? Management feels it has spent its money wisely, while IT gets the resources it needs to deliver solid performance and reliability back to the business. It's almost as if IT is doing management a favor by quietly doing the right thing -- seems like a win-win.
Unfortunately, this kind of budgetary shell game has serious drawbacks. By obscuring the "true" cost of running your infrastructure, you perpetuate a decision-making process that consistently yields poor, uninformed choices. It becomes all but impossible to adopt a chargeback system that reflects the actual cost of what IT does. And it feeds into the constant struggle between IT and management over what's truly worth spending money on -- especially when it comes to allocating manpower to manage infrastructure and applications. When you get right down to it, if you constantly feel your department is overworked and never given enough time to finish anything, look in the mirror. Being less than brutally honest about what things actually cost probably created that situation.
Of course, the alternative to obfuscation isn't particularly pleasant, either. It takes time, effort, and a thick skin to disclose the real cost of running an IT infrastructure as it relates to applications and services. But if you do it right and management is receptive, far fewer lamentable purchasing decisions will be made.