Nvidia announces humanoid robotics, custom AI infrastructure offerings at Computex 2025


Daniel Howley

Sun, May 18, 2025, 11:00 PM


Nvidia (NVDA) rolled into this year’s Computex Taipei tech expo on Monday with several announcements, ranging from the development of humanoid robots to the opening up of its high-powered NVLink technology, which allows companies to build semi-custom AI servers with Nvidia’s infrastructure.

The announcements come as Nvidia rides a recent hot streak after the US announced it was scrapping the Biden administration’s AI diffusion rules that would have put limits on which countries could buy the company’s AI chips.

Nvidia was also a topic of President Trump’s visit to Saudi Arabia, where the company said it will supply several hundred thousand AI processors to Humain, an AI startup owned by Saudi Arabia’s sovereign wealth fund, over the next five years.

During the event on Monday, Nvidia revealed Isaac GR00T-Dreams, which the company says helps developers generate enormous amounts of training data they can use to teach robots new behaviors and help them adapt to new environments.

Nvidia CEO Jensen Huang has said physical AI represents the world’s next trillion-dollar industry. And to get there, the company is leaning into building the software necessary to train and power humanoid robots in factories before they’re eventually available in our homes.

In addition to its robotics capabilities, Nvidia showed off its new NVLink Fusion offering, which lets customers build custom servers using Nvidia’s Grace CPU and a third-party AI chip paired with Nvidia’s various server infrastructure offerings. Customers can also choose to pair their own CPUs with one of Nvidia’s AI chips.

“Using NVLink Fusion, hyperscalers can work with the NVIDIA partner ecosystem to integrate NVIDIA rack-scale solutions for seamless deployment in data center infrastructure,” the company said in a statement.

The idea is to give infrastructure customers more options when it comes to building out their data center and server systems.

To that end, Nvidia is also working on what it calls its RTX Pro Blackwell servers. Nvidia says these servers, which run on the company’s Blackwell Server Edition GPUs, will drive “the shift from CPU-based systems to energy-efficient GPU-accelerated infrastructure.”

The systems, Nvidia explains, are meant to run “virtually every enterprise workload.” That includes use cases ranging from design and simulation software to running agentic AI programs and more.

The company also debuted DGX Cloud Lepton, a service that gives customers access to cloud-based AI compute for developing and rolling out their own AI software. Nvidia says it can do this by working with partners, including CoreWeave (CRWV), Foxconn (2354.TW), SoftBank (SFTBY), and others, which will host a global network of GPU clouds.
