Salem Radio Network News Thursday, November 13, 2025

Science

Cisco rolls out chip designed to connect AI data centers over vast distances 

By Stephen Nellis

SAN FRANCISCO (Reuters) - Cisco Systems on Wednesday launched a new networking chip designed to connect artificial intelligence data centers, with the cloud computing units of Microsoft and Alibaba signing on as customers for the chip.

The P200 chip, as Cisco calls it, will compete against rival offerings from Broadcom. It will sit at the heart of a new routing device, also rolled out on Wednesday, that is designed to connect the sprawling data centers, often separated by vast distances, that train AI systems.

Inside those data centers, companies such as Nvidia are connecting tens of thousands, and eventually hundreds of thousands, of powerful computing chips to act as one brain that handles AI tasks.

The purpose of the new Cisco chip and router is to connect multiple data centers together to act as one massive computer.

“Now we’re saying, ‘the training job is so large, I need multiple data centers to connect together,’” Martin Lund, executive vice president of Cisco’s common hardware group, told Reuters in an interview. “And they can be 1,000 miles apart.”

The reason for those big distances is that data centers consume huge amounts of electricity, which has driven firms such as Oracle and OpenAI to Texas and Meta Platforms to Louisiana in search of gigawatts. AI firms are putting data centers “wherever you can get power,” Lund said.

He did not disclose how much Cisco invested in building the chip and router, or its sales expectations for them.

Cisco said the P200 consolidates what previously required 92 separate chips into a single chip, and that the resulting router uses 65% less power than comparable devices.

One of the key challenges is keeping data in sync across multiple data centers without losing any, which requires a technology called buffering that Cisco has worked on for decades.

“The increasing scale of the cloud and AI requires faster networks with more buffering to absorb bursts” of data, Dave Maltz, corporate vice president of Azure Networking at Microsoft, said in a statement. “We’re pleased to see the P200 providing innovation and more options in this space.”
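For readers unfamiliar with the term, buffering here means temporarily queuing packets when traffic arrives faster than the outgoing link can carry it, so that short bursts are absorbed rather than dropped. The Python sketch below is purely illustrative and is not based on Cisco's design; the function name and parameters (simulate, buffer_size, arrivals, drain_rate) are hypothetical, and it simply shows why a buffer that is too small loses data during a burst.

```python
from collections import deque

def simulate(buffer_size, arrivals, drain_rate):
    """Toy model: a bounded FIFO buffer absorbing bursty traffic.

    buffer_size -- maximum packets the buffer can hold
    arrivals    -- packets arriving in each time step
    drain_rate  -- packets the outgoing link can send per time step
    Returns the number of packets dropped for lack of buffer space.
    """
    buffer, dropped = deque(), 0
    for arriving in arrivals:
        # Queue incoming packets until the buffer is full; drop the rest.
        for _ in range(arriving):
            if len(buffer) < buffer_size:
                buffer.append(1)
            else:
                dropped += 1
        # Drain at the link's steady rate.
        for _ in range(min(drain_rate, len(buffer))):
            buffer.popleft()
    return dropped

# A burst of 50 packets followed by quiet periods, drained at 10 per step:
burst = [50, 0, 0, 0, 0]
print(simulate(buffer_size=16, arrivals=burst, drain_rate=10))  # 34 packets dropped
print(simulate(buffer_size=64, arrivals=burst, drain_rate=10))  # 0 packets dropped
```

In this toy model, the larger buffer rides out the same burst without losing any packets, which is the basic trade-off behind the call for "more buffering to absorb bursts."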

(Reporting by Stephen Nellis in San Francisco; Editing by Muralikumar Anantharaman)
