Highlights:
- Enterprises developing AI software need both high-performance chips and a streamlined way to deploy their neural networks on them. Acquiring Nod.ai strengthens AMD’s ability to meet that requirement.
- AMD is working to expand its presence in the AI chip market. The company’s Epyc central processing units ship with built-in machine-learning optimizations.
Nod.ai, a startup that develops open-source software for accelerating artificial intelligence models, has been acquired by Advanced Micro Devices Inc. for an undisclosed sum.
Nod.ai, formally Nod Inc., is the creator of SHARK, an open-source AI tool. SHARK is a so-called runtime system designed to speed up neural network inference, and Nod.ai claims it can perform noticeably better than other open-source tools built for the same task.
Before a program such as an AI model can be deployed, it must be packaged with a number of auxiliary software components. A runtime system bundles many of those components together. Nod.ai’s SHARK runtime system is designed specifically with the requirements of neural networks in mind.
In a blog post last year, the company claimed that SHARK can run AI models more than three times faster than PyTorch and TorchScript. PyTorch is a popular framework for building neural networks, while TorchScript is a complementary technology for speeding up neural network inference.
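TorchScript’s role is easiest to see in code. The snippet below is a minimal sketch, using an illustrative toy model rather than anything from Nod.ai’s benchmarks, of how a PyTorch model is compiled to TorchScript before running inference.

```python
import torch

# A small illustrative model; real workloads would be far larger.
class TinyClassifier(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.layers = torch.nn.Sequential(
            torch.nn.Linear(128, 64),
            torch.nn.ReLU(),
            torch.nn.Linear(64, 10),
        )

    def forward(self, x):
        return self.layers(x)

model = TinyClassifier().eval()

# torch.jit.script compiles the model into TorchScript, a static
# representation that can be optimized and run outside the Python interpreter.
scripted = torch.jit.script(model)

with torch.no_grad():
    output = scripted(torch.randn(1, 128))  # inference through the compiled model
print(output.shape)  # torch.Size([1, 10])
```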
Nod.ai further claims that SHARK can outperform XLA, a technology used to improve the efficiency of AI models built with TensorFlow, a well-known alternative to PyTorch.
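For context, XLA is typically enabled in TensorFlow by asking a function to be just-in-time compiled. The sketch below uses an illustrative computation; the jit_compile flag is the relevant part.

```python
import tensorflow as tf

# jit_compile=True asks TensorFlow to lower this function through XLA,
# which fuses operations and generates optimized device code.
@tf.function(jit_compile=True)
def dense_layer(x, w, b):
    return tf.nn.relu(tf.matmul(x, w) + b)

x = tf.random.normal((1, 128))
w = tf.random.normal((128, 10))
b = tf.zeros((10,))
print(dense_layer(x, w, b).shape)  # (1, 10)
```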
Besides developing SHARK, Nod.ai contributes to several other open-source initiatives aimed at speeding up the development of AI applications.
Developers use a compiler to turn their source code files into executable programs. The compiler first translates the code into an “intermediate representation,” an abstract description of the program’s functionality, and then converts that description into instructions a computer’s processor can execute.
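PyTorch’s own tooling offers a convenient way to see an intermediate representation: scripting a function with TorchScript exposes a graph IR that sits between the Python source and the instructions ultimately executed. A minimal illustration, using a hypothetical function:

```python
import torch

# Compile a simple function to TorchScript and inspect its intermediate
# representation: a graph of typed operations rather than Python source.
@torch.jit.script
def scale_and_add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    return x * 2.0 + y

# .graph prints the IR, e.g. aten::mul and aten::add nodes on tensor values.
print(scale_and_add.graph)
```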
One of the open-source initiatives that Nod.ai contributes to is Torch-MLIR, an intermediate representation explicitly designed for AI use cases. The business also contributes to IREE, a compiler that technically has much in common with Torch-MLIR. The source code of a neural network can be converted into a working program that can run on graphics cards and other AI-optimized chips using IREE by developers.
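As a rough outline of how the two projects fit together, the sketch below uses the torch_mlir.compile entry point and the iree-compile command line roughly as documented around the time of the acquisition; the exact APIs and flags have evolved since, so treat the specifics as assumptions rather than a definitive recipe.

```python
import torch
import torch_mlir  # Torch-MLIR Python package (API as documented circa 2023)

class Scale(torch.nn.Module):
    def forward(self, x):
        return x * 2.0

# Lower the PyTorch module into an MLIR intermediate representation.
# "linalg-on-tensors" is one of the output dialects Torch-MLIR targets.
module = torch_mlir.compile(Scale(), torch.ones(4), output_type="linalg-on-tensors")

with open("scale.mlir", "w") as f:
    f.write(str(module))

# The resulting IR can then be compiled by IREE into a deployable artifact,
# e.g. (CLI flags are assumptions based on IREE's documentation):
#   iree-compile --iree-hal-target-backends=llvm-cpu scale.mlir -o scale.vmfb
#   iree-run-module --module=scale.vmfb --function=forward --input="4xf32=1 2 3 4"
```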
AMD is making a concerted effort to grow its share of the AI chip market. The company includes built-in machine-learning optimizations in its Epyc central processing units. It also offers specialized AI processors such as the recently launched MI300X, which is designed to compete with Nvidia Corp.’s market-leading graphics cards.
Businesses creating AI software need fast chips and a straightforward way to deploy their neural networks on those chips. With the addition of Nod.ai, AMD may be better positioned to meet that need. One of the trickiest parts of deploying AI software is optimizing neural network performance, a task Nod.ai’s SHARK tool simplifies.
Senior Vice President of AMD’s Artificial Intelligence Group, Vamsi Boppana, said, “The addition of the talented Nod.ai team accelerates our ability to advance open-source compiler technology and enable portable, high-performance AI solutions across the AMD product portfolio. Nod.ai’s technologies are already widely deployed in the cloud, at the edge and across a broad range of end point devices today.”
Following the acquisition, AMD might seek to improve Nod.ai’s SHARK tool. For instance, the chipmaker could enhance the performance of SHARK-powered AI models running on its processors.
AMD’s competitors are also active in the open-source community. Intel Corp., for instance, is among the major corporate code contributors to Linux and to Chromium, the open-source project on which Google Chrome is based.
The Nod.ai purchase may help AMD make its AI accelerators more competitive with Nvidia’s graphics cards at the go-to-market level. Alongside its chips, Nvidia offers a software platform called Nvidia AI Enterprise that makes it simpler for customers to boost the performance of their neural networks. The platform includes various development tools, prepackaged AI models, and scientific computing software.