Equinix and NVIDIA offer datacenter service for private AI
February 29, 2024
Are you looking for a high-performance environment in which to build and run customised Generative Artificial Intelligence (AI) models, but don't know where to start when it comes to infrastructure, application development tools and data security? To meet this demand, Equinix, a global provider of digital infrastructure, has launched a new private cloud service that combines NVIDIA solutions, including the DGX platform, networking resources and the AI Enterprise software suite.
Customers can have their NVIDIA infrastructure deployed and operated within Equinix's worldwide network of interconnected International Business Exchange (IBX) datacenters. With the new service, companies can scale their infrastructure to match the level of performance they need for their AI models.
The service also provides straightforward access to NVIDIA AI Enterprise software for developing and deploying Generative AI applications, using pre-trained models, optimised frameworks and data science software libraries. With this, customers can build what Equinix calls private AI: capitalising on the benefits of AI while maintaining the confidentiality and security of data in environments that are not publicly accessible.
“To exploit the incredible potential of Generative AI, companies need adaptable and scalable hybrid infrastructures in their local markets to bring AI supercomputing to their data,” says Charles Meyers, president and CEO of Equinix. “Our new service offers a fast and cost-effective way to adopt advanced AI infrastructures operated and managed by experts around the world.”
“Generative AI is transforming every industry. Now companies can count on NVIDIA's supercomputing resources and AI software, combined with the efficiency of Equinix management in hundreds of datacenters around the world,” says Jensen Huang, founder and CEO of NVIDIA.
The service will guarantee access to high-speed private networks for global service providers and enable the rapid retrieval of AI information over long distances. In addition, it will provide high-bandwidth private interconnections for cloud services and business service providers and thus accelerate the execution of AI workloads while ensuring compliance and data security.
According to Equinix, organisations already using the new service include leaders in the biopharmaceutical, financial services, software, automotive and retail sectors that are setting up AI Centres of Excellence to evaluate LLM (Large Language Model) use cases. Through these AI initiatives, they hope to speed up the launch of new drugs, develop AI copilots for customer service and create virtual assistants to raise productivity.
The new service is already available through Equinix or NVIDIA representatives.
Broadly speaking, private AI answers the growing demand for responsible and sustainable Artificial Intelligence that has followed the first wave of enthusiasm, combined with the mounting concerns about data management and protection that sit at the centre of any strategy in this area. In short: how can organisations maximise the value of AI models without exposing their data or putting it at risk? Equinix's answer is private AI.
With the arrival just over a year ago of large language models, such as those behind ChatGPT, public AI also emerged: serving many different use cases but sharing the same models and, potentially, the same data. “By feeding public AI models with your company's proprietary data, you agree to make that data public, whether you know it or not,” explains Ruth Faller, vice president of development and strategy at Equinix.
Some organisations have now begun to recognise the limitations that public AI imposes on their strategies. In response, the concept of private AI is emerging: an environment built by or for a specific organisation, for its exclusive use.
The three main advantages of a private AI scheme are:
1. Protecting proprietary data: Unlike public AI models, which can expose the data they use, private AI relies on a private data architecture that remains under the owner's control and is processed exclusively for the owner's benefit.
2. Mitigating regulatory risks: Storing and processing large volumes of data, as AI models require, proportionally raises regulatory risk across the data lifecycle. Private AI employs architectures that give complete control over the data: where it is physically stored, which equipment processes and moves it, and who has access to it and for what purposes.
3. Optimising performance and costs: Private architectures can minimise the delays involved in feeding AI models which, in public AI, may span different environments and interconnection systems. Private AI systems can be built close to the data stores that feed them, guaranteeing a consistent, low-latency data flow, often without paying third parties for connectivity.
In its blog, Equinix explains that there is no need to keep a private AI environment completely isolated from public clouds. The advantage is precisely that you can connect to public clouds on your own terms, keeping custody of the data and moving it to the cloud when necessary and via private, dedicated network connections.