
DigitalOcean wants to open the floodgates to AI for smaller enterprises with Nvidia H100 access

Credit: VentureBeat made with Midjourney



AI isn’t just for the largest enterprises; it’s increasingly being used by small and mid-sized businesses (SMBs) too.

DigitalOcean has built its cloud business with a focus on developers who cater to SMBs, and it’s now growing that business with new support for the industry-leading Nvidia H100 GPU AI accelerator. The Nvidia H100 will be available on the Paperspace platform, which DigitalOcean acquired in July 2023 as an entry point into the AI space. Paperspace already provides access to GPUs and also offers a data science platform known as Gradient.

The new offering provides access to the highly sought-after H100 GPUs on Paperspace, using an approach that virtualizes the GPUs to make them more accessible to developers. With their extremely high performance, H100s can significantly reduce training times and inference response times for AI models.

With the current hype around generative AI, demand for Nvidia H100 GPUs is extremely high, which makes it difficult for smaller organizations to get access.

“For these companies, when they need very powerful GPUs, the way things stand now it’s actually quite tough to get access to them, and so they find their own business growth gets restricted, right? And that’s where we come in,” Kanishka Roychoudhury, GM of AI/ML at DigitalOcean, told VentureBeat.

Taking a virtualization approach to H100 GPU access

To put it mildly, the Nvidia H100 GPU is a beast in terms of capacity and performance. 

Not every small business or developer needs a full H100, but they might want some fraction of the power and AI capabilities.

“Smaller startups come in all shapes and sizes,” Roychoudhury said.

The DigitalOcean Paperspace platform is taking a virtualization approach to providing the right level of GPU access and performance for whatever an organization might need.

Roychoudhury explained that each physical Nvidia H100 GPU can be virtualized to provide up to eight separate GPU instances that can be used by developers. Organizations can choose to provision a fractional amount, say a single instance, or a full H100 that uses all eight slices.
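DigitalOcean hasn’t detailed the underlying virtualization mechanism, but from a developer’s perspective a fractional instance shows up as an ordinary CUDA device. A minimal sketch, assuming a standard PyTorch environment (not any Paperspace-specific API), of how a workload can confirm which device, and how much memory, it has actually been allocated:

```python
# Minimal sketch, assuming a standard PyTorch environment: a fractional GPU
# instance is exposed to the workload as an ordinary CUDA device.
import os
import torch

# Providers commonly scope GPU access by setting CUDA_VISIBLE_DEVICES for the workload.
print("CUDA_VISIBLE_DEVICES:", os.environ.get("CUDA_VISIBLE_DEVICES", "<unset>"))

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        # On a fractional instance, total_memory reflects the slice, not the full card.
        print(f"device {i}: {props.name}, {props.total_memory / 2**30:.1f} GiB")
else:
    print("No CUDA device is visible to this process.")
```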

How SMBs are building with AI today

SMBs are already making use of the DigitalOcean Paperspace platform to build various types of AI applications. Roychoudhury hopes that the new Nvidia H100 support will broaden the range of applications that get built.

Roychoudhury mentioned that DigitalOcean and Paperspace customers from small businesses and startups are using the platform to build a wide variety of AI applications, not just chatbots. One example he cited is a video augmentation service for translating videos between languages.

In terms of how the H100 GPUs will specifically be used, Roychoudhury said he expects that they will primarily be used for AI model training due to their high performance. However, he noted that some customers are already using the less powerful Nvidia A100 GPUs for inference workloads as well, and it’s likely the H100 will be deployed to support inference as needs grow over time. 

What’s next for AI at DigitalOcean

The Paperspace acquisition by DigitalOcean is still relatively new, with integration efforts still underway. 

Roychoudhury expects that there will be more integration in the months to come, but he emphasized that the two platforms already work together. For example, GPU resources, including the new Nvidia H100 machines, can be deployed via Paperspace with access to storage resources running in DigitalOcean compute.

H100 availability will also expand over time. Initially, the high-end Nvidia GPUs will be available only as infrastructure resources; the plan is to also make them available via the Paperspace Gradient data science platform.

Roychoudhury explained that Gradient is a notebook-based environment for data scientists and engineers to build, train, test and deploy AI models. When launching a Gradient project, developers get basic computing resources and can upgrade to add more powerful resources like GPUs. Currently, Gradient integrates with Hugging Face to allow downloading and fine-tuning pre-trained models.
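Gradient’s own APIs aren’t shown here, but the Hugging Face side of that workflow looks roughly like the sketch below: download a pre-trained model and fine-tune it on a small dataset. The model and dataset names are chosen purely for illustration and are not part of any Paperspace-specific interface.

```python
# Illustrative sketch of the download-and-fine-tune workflow described above,
# using the Hugging Face transformers and datasets libraries. The model and
# dataset names are placeholders, not part of any Paperspace-specific API.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "distilbert-base-uncased"                 # example pre-trained model
train_data = load_dataset("imdb", split="train[:1%]")  # tiny slice, for demo only

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length")

train_data = train_data.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out",
                           per_device_train_batch_size=8,
                           num_train_epochs=1),
    train_dataset=train_data,
)
trainer.train()  # runs on whatever GPU (or GPU fraction) the notebook was given
```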

The overall direction DigitalOcean is headed in is all about reducing the friction of getting access to high-end AI tools and hardware, like the Nvidia H100.

“We want to actually make this very easily available and affordable to the smaller companies,” Roychoudhury said.