If there's a single common element to most successful data centers, it's at least one admin with a touch of OCD. At some level, obsession and compulsion are necessary to keep things running smoothly and to reduce the propensity for the situation to go haywire.
It's not just about keeping a close, detail-oriented eye on the software and systems, but also about paying attention to how the data center is constructed and maintained. Basically, it comes down to knowing when to put a "more urgent" project on hold to rip out and rework a completely different subsystem, for no other reason than that it wasn't done right.
One area where this has a tendency to crop up in general-purpose data centers revolves around cabling. When the data center was built, the cabling was immaculate. It was color-coded, neatly patched, wrapped to ladder racks and top-of-rack guides, gracefully strung down the rack sides, and carefully secured to each server and switch with an aesthetic that might just qualify as art. But that was The Before Time. That might have been two or three years ago, several server generations ago, several storage arrays ago, and possibly even several admins ago.
Now, like a Roman ruin, the origins of an expertly constructed and beautifully managed data center are still visible, but mired in the slough of the intervening years. The color-coding has long been rendered meaningless, though some vestiges remain.
Where a red cable once always signified an unsecured link, now it might be an internal management link or even storage. Cables draped along those ladder racks have been covered over with a mishmash of other wiring, some of it coiled up in various places because there were no 15-foot cables to be had. Instead, 25-foot cables were called into service, their excess length gathered into haphazard rings and zip-tied together so that no human may trace them ever again.
But this isn't the case in all data centers. Some have the benefit of being highly fixed-purpose, where the equipment turnover is low, and upgrades usually involve a forklift. If every rack has and will always have 40 servers, you can probably reuse the original cabling without much fuss. But in data centers with high turnover of heterogeneous equipment, it can be a real challenge to maintain coherency in the face of a revolving door of gear, all with different needs.
In those spaces, it's not uncommon to see bundles of cables hanging unattached within racks, yanked out during a refit or emergency, and left to dangle for months or years until someone asks where they go. Sadly, nobody really knows anymore.
In other racks, there might be a collection of links heading into smaller switches, run so tightly that the switches can't be removed from the rack without removing the cabling as well. When one of those switches goes bad, replacing it becomes a far more tedious task than it should be.