
Dec 19 2025
Data Center

The Importance of Data Centers for Healthcare Hybrid Infrastructures

In healthcare, on-premises data centers are becoming as critical as the cloud for fast, reliable access to clinical data and scalable hybrid infrastructure.

Health systems are expanding artificial intelligence (AI) initiatives across imaging, diagnostics and patient-facing workflows, underscoring the shifting role of the on-premises data center from a legacy infrastructure component to a critical performance layer.

The evolution is driven by rising clinical workloads that demand faster data access, tighter service-level expectations and compute capacity positioned closer to the point of care.

It’s a shift that reflects both longstanding realities in healthcare IT and the accelerating demands of care delivery enabled by AI, says Murali Gandluru, vice president of product management, data center networking at Cisco. He notes that care teams and patients depend on applications and data that are increasingly distributed across environments.

While cloud services — public, private or hosted — now power a significant portion of healthcare software and productivity tools, that model is not sufficient for every workload, particularly those tied directly to patient care.

DISCOVER: Accelerate innovation with an AI-ready infrastructure delivered by Cisco.

Organizations are beginning to distinguish between cloud-delivered AI meant for general productivity and AI models that must perform closer to clinical systems. That shift is reshaping where AI runs and why.

“The performance and scale elements are driving a lot more on-prem use cases for solving problems close to the touchpoint between the doctor and the outcomes,” Gandluru explains.

Reduce Delays and Support Clinical Workflows From the Data Center

Local compute reduces delays in accessing diagnostic data and supports time-sensitive workflows in radiology, labs and clinical decision support, among other areas. As AI becomes embedded across clinical operations, Gandluru sees healthcare leading innovation around distributed infrastructure and edge-aligned AI.

“You can uncover insights faster and more efficiently when AI systems are positioned closer to the edge,” he adds.

Gandluru says that as health systems move deeper into AI deployments, chief information officers must decide how to strengthen core infrastructure to support training, inference and real-time clinical analytics. For him, the starting point is governance — not GPUs.


“The key things to think about for any IT leader, especially in healthcare environments, is to ensure the systems you’re building are a continuance of the governance and compliance requirements your healthcare institution has,” he says.

With AI drawing on increasingly sensitive data sets, he argues that data protection must be baked into the foundation.

“Protecting your AI/ML assets and protecting your data must be done from the first principles,” he says, noting that organizations should only decide where AI workloads will run after creating a foundation of good governance.

Considerations for Server Management and Enterprise Operations

The shift to on-premises also changes how health systems approach server management and physical design.

“If you are building out small rooms within your site, as a branch office of a healthcare provider or a remote hospital, you need to think first about the hardware or facilities level — power, cooling, all of those elements,” he says.

The harder challenge, Gandluru notes, is operating these environments at enterprise scale.

“From a CIO or CISO perspective, you want to be able to operate at scale, and that automatically means you’ve got to have all of the automation elements baked in,” he says, adding Cisco’s own experience illustrates the need. “We’ve supported 500 sites with our Nexus ACI infrastructure and have been able to provide automation of not only day-zero provisioning but day-one and day-two monitoring, visibility and quick resolution of where a problem is.”

For servers, the principles are the same: Whether server management is delivered from the cloud or on-premises depends on the particular use case, Gandluru says, pointing to smaller, edge-friendly designs as another growing component.

“The Cisco Unified Edge platform allows you to have CPU, GPU, network and storage in a small form factor,” he notes, emphasizing that these systems must still connect to centralized management systems to avoid sprawl.

READ MORE: Meet the demand for modern data centers in healthcare.

Avoiding the Snowflake Effect in Healthcare AI Infrastructure

From Gandluru’s perspective, building AI infrastructure is the easy part; running it sustainably is the challenge.

He points out that the same IT professionals who will be managing AI infrastructure will also be managing virtualization or cloud infrastructure.

“They have developed best practices that they’re comfortable with,” he says. “It’s important to expand those best practices into AI environments.”

Gandluru says hybrid design depends on preventing AI and edge sites from becoming one-off deployments.

“It is important to make sure that you are not ending up managing each of these edge-inferencing environments as snowflakes,” he says. “They must fit into the existing systems and processes.”

He explains that IT teams should extend familiar operational practices, governance models and security controls across both cloud and on-premises environments so they can consistently troubleshoot and maintain compliance.

“It’s about seamlessly expanding your existing policies across this new environment,” Gandluru says.
