A new report says the “always maxed out” image of data centers is more myth than reality. Could that misguide grid planning? (Power & Policy)
What it says: Load factor, utilization rate, and uptime measure different things, but they're often conflated. Load factor compares average demand to observed peak demand, while true utilization compares average demand to installed (nameplate) capacity, so a site with a 90% load factor might still run at only ~72% of true capacity if its peak never approaches nameplate (see the sketch below). Underutilization stems from ramp-up periods, oversized designs, fluctuating workloads, and downtime from maintenance or hardware failures.
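A minimal sketch of the distinction, assuming the textbook definitions (load factor = average ÷ peak; utilization = average ÷ nameplate) and a hypothetical facility whose numbers are chosen to reproduce the 90%-vs-~72% example above; the report's own methodology may differ:

```python
def load_factor(avg_mw: float, peak_mw: float) -> float:
    """Average demand relative to observed peak demand."""
    return avg_mw / peak_mw

def utilization(avg_mw: float, nameplate_mw: float) -> float:
    """Average demand relative to installed (nameplate) capacity."""
    return avg_mw / nameplate_mw

# Hypothetical facility: 100 MW nameplate, 80 MW peak demand,
# 72 MW average demand over the reporting period.
nameplate, peak, avg = 100.0, 80.0, 72.0

print(f"load factor: {load_factor(avg, peak):.0%}")       # 90%
print(f"utilization: {utilization(avg, nameplate):.0%}")  # 72%
```

The same facility looks nearly maxed out by one metric and substantially idle by the other, which is exactly the conflation the report warns against.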
Why it matters: Without better reporting on real-world usage, utilities risk overbuilding generation and grid capacity for demand that never materializes, diverting capital from other reliability and infrastructure needs.