While the chip industry has seen plenty of head-turning M&A transactions over the last few years, an Nvidia-ARM deal might take the cake, given its potential strategic implications.
A little over a week after Bloomberg first reported on the deal talks, Bloomberg and The Financial Times have both reported that Nvidia (NVDA) is in "advanced talks" to buy ARM, whose CPU core designs and other IP go into chips powering billions of devices shipping each year -- everything from smartphones to video cameras to hard drives to washing machines. The FT reports, without elaborating further, that the companies are discussing a cash-and-stock deal that would value ARM at more than the $32 billion Japan's SoftBank paid for it in 2016.
With the qualifier that there's no guarantee at this point that a deal will be inked, here are some initial thoughts about an Nvidia-ARM tie-up.
1. Nvidia's Soaring Stock Price Might Be Influencing Its Actions
As of Friday's close, Nvidia -- following an 80% 2020 gain -- is worth $260 billion. As a result, if Nvidia financed an ARM deal primarily with stock, the transaction would dilute existing shareholders far less than it would have at the beginning of the year.
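To put rough numbers on the dilution point, here's a back-of-the-envelope sketch. The ~$40 billion deal value is purely an assumption (the FT says only "more than $32 billion"), and the start-of-2020 market cap is simply what the 80% gain and the $260 billion figure above imply.

```cuda
// Back-of-the-envelope dilution math (host-only code; the deal value is a
// hypothetical figure, not a reported one).
#include <cstdio>

int main() {
    const double deal_value_b = 40.0;   // assumed all-stock deal value, in $B
    const double mcap_jan_b   = 144.0;  // ~$260B / 1.8 -- implied start-of-2020 market cap
    const double mcap_now_b   = 260.0;  // market cap as of Friday's close

    // New shares issued to the seller, as a fraction of the post-deal share count.
    const double dilution_jan = deal_value_b / (mcap_jan_b + deal_value_b);
    const double dilution_now = deal_value_b / (mcap_now_b + deal_value_b);

    printf("Dilution at start-of-2020 market cap: %.1f%%\n", dilution_jan * 100.0); // ~21.7%
    printf("Dilution at current market cap:       %.1f%%\n", dilution_now * 100.0); // ~13.3%
    return 0;
}
```

At those assumed figures, the same all-stock deal would hand ARM's seller roughly 13% of Nvidia today versus more than 20% at January's market cap -- which is why the stock's run-up matters.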
2. Nvidia Might Be Uniquely Positioned Among Big Chip Developers to Pull This Off...
There's virtually no chance that antitrust regulators would ever allow Intel (INTC) to buy ARM. And while a major chip developer/ARM licensee such as Qualcomm (QCOM) or Texas Instruments (TXN) might in theory be able to afford a deal, the fact that many of their biggest rivals also heavily rely on ARM makes such a move pretty unlikely in practice.
Nvidia, by comparison, gets the vast majority of its revenue from discrete GPUs rather than chips containing CPU cores. And its two biggest GPU/accelerator rivals -- AMD (AMD) and Intel -- develop CPUs based on the x86 instruction set rather than the ARM instruction set.
3. ...But There Would Still Be Challenges
Though its discrete GPUs don't contain ARM CPU cores, Nvidia's Tegra system-on-chips (SoCs), which can be found inside of infotainment systems and the Nintendo Switch, do contain them. So do processors found within its Drive computing boards for autonomous and semi-autonomous cars, and its Jetson computing boards for embedded systems.
These products in turn compete against ARM-powered silicon from many other chip developers. Nvidia would need to find a way to address this conflict of interest if it went forward with an ARM deal -- particularly given that the open-source RISC-V instruction set is making headway in the embedded/IoT processor space. And as many others have noted, Nvidia would also have to convince regulators that its ownership of ARM wouldn't be harmful to rivals.
Also: ARM has a mess on its hands right now with its Chinese JV, whose CEO has refused to step down after being fired and has hired security guards to prevent ARM execs from entering the JV's headquarters. Nvidia, which does a lot of business in China, would need to address this problem before it closed an ARM deal.
4. Nvidia Might See Opportunities to Make ARM More Profitable
While ARM's revenue has grown at a moderate pace over the last few years -- its 2019 revenue was around $1.9 billion, up from around $1.5 billion in 2015 -- its profits have fallen sharply, as SoftBank dialed up ARM's spending, particularly on R&D. Nvidia might see room to pare back some of this spending.
In addition, while it's unlikely to seek much higher licensing fees or royalties for low-power embedded chips at a time when RISC-V is knocking at the door, Nvidia might see room to increase the license fees and/or royalties ARM collects from chip developers whose processors carry high double-digit or triple-digit selling prices -- think processors powering hardware such as PCs, servers, high-end smartphones and mobile base stations. Two weeks ago, Reuters reported that ARM is attempting to raise license fees for certain unnamed clients.
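For a sense of why the upside is concentrated in pricier chips: ARM royalties are generally a small percentage of a chip's selling price, so the same rate translates into very different dollar amounts per unit. The rate and prices below are assumptions for illustration, not disclosed ARM terms.

```cuda
// Hypothetical per-chip royalty math (host-only code; rate and prices are
// illustrative assumptions, not ARM's actual terms).
#include <cstdio>

int main() {
    const double royalty_rate = 0.02;   // assumed 2% of chip selling price
    const double asp_embedded = 2.0;    // assumed low-power embedded chip, $2
    const double asp_server   = 500.0;  // assumed server-class processor, $500

    printf("Royalty per embedded chip: $%.2f\n", royalty_rate * asp_embedded); // $0.04
    printf("Royalty per server chip:   $%.2f\n", royalty_rate * asp_server);   // $10.00
    return 0;
}
```

At those assumed figures, one high-end processor generates as much royalty revenue as roughly 250 embedded chips -- and that's before any rate increase.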
5. Nvidia Could See ARM's CPU IP as a Missing Piece for Realizing Its Data Center Vision
Nvidia's server GPUs now accelerate many different types of demanding AI, high-performance computing (HPC), machine learning and analytics workloads. And the company has repeatedly declared that it expects most servers to eventually feature one or more accelerators of some kind.
Meanwhile, with its recent acquisition of Mellanox Technologies, Nvidia obtained a company that (among other things) develops programmable silicon capable of offloading network processing -- and potentially security, storage and virtualization functions as well -- from a server's CPUs.
That leaves CPUs as the one type of server processing unit that Nvidia doesn't have covered. Moreover, the coming years will see the arrival of server platforms (including ones from Intel and AMD) that support memory coherency across CPUs and GPUs (i.e., the ability of CPUs and GPUs to rely on a common pool of memory). This is an advance that stands to improve system performance and resource efficiency, while also making life easier for software developers.
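For readers who want to see what that "common pool of memory" idea looks like in practice, the closest widely available analogue today is CUDA's managed (unified) memory, where CPU and GPU code share a single allocation and the runtime handles data movement. The sketch below is illustrative only; the coherent CPU/GPU platforms described above would push this further into hardware.

```cuda
// Minimal sketch of CPU and GPU code sharing one pool of memory via CUDA
// managed memory (cudaMallocManaged). Hardware memory coherency, as described
// above, would extend this idea so CPUs and GPUs share memory natively.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float* data, int n, float factor) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;                // GPU writes into the shared allocation
}

int main() {
    const int n = 1 << 20;
    float* data = nullptr;

    cudaMallocManaged(&data, n * sizeof(float)); // one allocation, visible to CPU and GPU

    for (int i = 0; i < n; ++i) data[i] = 1.0f;  // CPU initializes in place -- no explicit copy

    scale<<<(n + 255) / 256, 256>>>(data, n, 2.0f);
    cudaDeviceSynchronize();                     // make GPU writes visible before the CPU reads

    printf("data[0] = %.1f\n", data[0]);         // prints 2.0
    cudaFree(data);
    return 0;
}
```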
In addition, during an insightful April interview about Nvidia's plans for Mellanox, CEO Jensen Huang candidly admitted that his company still sees CPUs (along with accelerators and SmartNICs) playing an important role in data center computing going forward, as data centers keep evolving to better allow different types of processing resources to be individually upgraded, pooled and reconfigured on the fly to support giant workloads.
Also, when asked if Nvidia, which several years ago abandoned a project to develop ARM server CPUs, would be open to developing a server CPU down the road, Huang didn't exactly rule it out. He did, however, stress that Nvidia will only pursue a chip project if it feels it can do something truly unique.
Should Nvidia buy ARM, it could subsequently work on end-to-end platforms supporting memory coherency and featuring its own CPUs, GPUs and SmartNICs. Or if it wants to minimize conflicts of interest, it could simply invest heavily in ARM's Neoverse platform for server CPUs and network processors, and partner closely with existing ARM server CPU developers such as Marvell Technology (MRVL) and Amazon Web Services to bring such platforms to market, as these companies try to chip away at the dominant market share of Intel and AMD's x86 server CPUs.
6. Nvidia Could See a Lot of Value in ARM's Immense Mobile Footprint
ARM CPU cores are found inside of just about every smartphone on the planet, as well as a large percentage of tablets. And they can now also be found in a number of AR and VR headsets.
One thing that all of these devices have in common: They need GPUs as well -- in some cases, fairly powerful GPUs that can drive high-resolution displays and/or play graphics-intensive games at high frame rates. On top of that, the SoCs powering a pretty large percentage of these devices now also contain (in addition to CPUs and GPUs) co-processors for handling AI/deep learning algorithms.
ARM is trying to address these needs as well, via its Mali GPUs and Ethos-N AI co-processors. But these fields are more competitive: Qualcomm's Snapdragon SoCs, for example, pair ARM CPU cores with proprietary GPUs and AI co-processors, and Samsung has inked a deal with AMD to use AMD's GPU IP within future mobile processors.
Given both its GPU empire and its work on integrating specialized AI processing cores (known as Tensor Cores) within many of its newer GPUs, Nvidia looks well-positioned to grow ARM's GPU and AI co-processor attach rates within mobile devices and AR/VR headsets.