Nvidia Joins Big Tech Deal Spree To License Groq Technology, Hire Its CEO

Bonisiwe Shabane

Nvidia has struck a non-exclusive licensing agreement with AI chip competitor Groq. As part of the deal, Nvidia will hire Groq founder Jonathan Ross, president Sunny Madra, and other employees. CNBC reported that Nvidia is acquiring assets from Groq for $20 billion; Nvidia told TechCrunch that this is not an acquisition of the company and did not comment on the scope of the deal. But if CNBC’s numbers are accurate, the purchase would be Nvidia’s largest ever, and with Groq on its side, Nvidia is poised to become even more dominant in chip manufacturing. As tech companies compete to grow their AI capabilities, they need computing power, and Nvidia’s GPUs have emerged as the industry standard. But Groq has been working on a different type of chip called an LPU (language processing unit), which it claims can run LLMs 10 times faster while using one-tenth the energy.

Groq’s CEO Jonathan Ross is known for this sort of innovation — when he worked for Google, he helped invent the TPU (tensor processing unit), a custom AI accelerator chip. In September, Groq raised $750 million at a $6.9 billion valuation. Its growth has been quick and significant — the company said that it powers the AI apps of more than 2 million developers, up from about 356,000 last year.

Updated, 12/24/25 at 5:40 p.m. ET, with clarification from Nvidia about the nature of the deal.

Nvidia has agreed to license chip technology from startup Groq and hire away its CEO, a veteran of Alphabet’s Google, Groq said in a blog post on Wednesday.

The deal follows a familiar pattern in recent years where the world’s biggest technology firms pay large sums in deals with promising startups to take their technology and talent but stop short of formally... Groq specializes in what is known as inference, where artificial intelligence models that have already been trained respond to requests from users. While Nvidia dominates the market for training AI models, it faces much more competition in inference, where traditional rivals such as Advanced Micro Devices have aimed to challenge it as well as startups such... Nvidia has agreed to a “non-exclusive” license to Groq’s technology, Groq said. It said its founder Jonathan Ross, who helped Google start its AI chip program, as well as Groq President Sunny Madra and other members of its engineering team, will join Nvidia. A person close to Nvidia confirmed the licensing agreement.

The chip giant is acquiring Groq’s IP and engineering team as it moves to lock down the next phase of AI compute. Nvidia has announced a $20 billion deal to acquire Groq’s intellectual property. While it is not buying the company itself, Nvidia will absorb key members of its engineering team, including founder and ex-Google engineer Jonathan Ross and Groq president Sunny Madra, marking the company’s largest AI-related transaction since... Nvidia’s purchase of Groq’s LPU IP focuses not on training — the space Nvidia already dominates — but on inference, the computational process that turns AI models into real-time services.

Groq’s core product is the LPU, or Language Processing Unit, a chip optimized to run large language models at ultra-low latency. Where GPUs excel at large-batch parallelism, Groq’s statically scheduled architecture and SRAM-based memory design enable consistent performance for single-token inference workloads. That makes it particularly well-suited for applications like chatbot hosting and real-time agents, exactly the types of products that cloud vendors and startups are racing to scale.

In a move that has sent shockwaves through Silicon Valley and Wall Street alike, Nvidia (NASDAQ: NVDA) announced a landmark $20 billion licensing agreement and strategic "acqui-hire" of AI chip disruptor Groq on December... The deal, finalized just as the market closed for the holiday break, represents the most significant consolidation of AI hardware power since the start of the generative AI boom. By integrating Groq’s high-speed Language Processing Unit (LPU) technology into its own massive ecosystem, Nvidia is positioning itself to dominate the "inference era"—the phase where AI models are deployed at scale rather than just...
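To make the latency distinction above concrete: batch-oriented serving is judged by aggregate tokens per second, while interactive serving is judged by time to first token and per-token decode latency. The sketch below is a rough illustration of how those two numbers are measured against any streaming backend; the generate_stream interface, profile_single_stream helper, and toy_stream stand-in are hypothetical names introduced for the example, not part of any Groq or Nvidia API.

```python
import time
from typing import Callable, Iterable

# Hypothetical interface: a callable that streams generated tokens for a prompt.
# Any real inference client (GPU- or LPU-backed) could be wrapped to fit this shape.
TokenStream = Callable[[str], Iterable[str]]

def profile_single_stream(generate_stream: TokenStream, prompt: str) -> dict:
    """Measure time to first token (prefill latency) and per-token decode latency,
    the metrics that matter for interactive, single-stream inference."""
    start = time.perf_counter()
    first_token_at = None
    count = 0
    for _ in generate_stream(prompt):
        if first_token_at is None:
            first_token_at = time.perf_counter()
        count += 1
    end = time.perf_counter()
    ttft = (first_token_at - start) if first_token_at else float("nan")
    decode = (end - first_token_at) / max(count - 1, 1) if first_token_at else float("nan")
    return {"time_to_first_token_s": ttft, "decode_s_per_token": decode, "tokens": count}

# Toy stand-in so the sketch runs end to end; swap in a real client to compare backends.
def toy_stream(prompt: str) -> Iterable[str]:
    for word in ("a", "toy", "response"):
        time.sleep(0.01)  # pretend per-token decode work
        yield word

print(profile_single_stream(toy_stream, "hello"))
```

Groq's pitch, as described above, is that its architecture keeps the decode number low and predictable even at a batch size of one.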

The immediate implications of this deal are profound. Nvidia is no longer just the king of AI training; it has effectively neutralized its most credible threat in the ultra-low-latency inference market. As the industry pivots toward real-time "agentic" AI and digital humans, the ability to process tokens at lightning speed has become the new gold standard. With this deal, Nvidia has not only secured the intellectual property necessary to maintain its lead but has also absorbed the engineering talent responsible for the world’s fastest inference architecture, setting the stage for... The agreement, valued at roughly $20 billion, is structured as a non-exclusive licensing deal paired with a massive "acqui-hire" of Groq’s core leadership and engineering teams. This complex structure was reportedly chosen to navigate the increasingly treacherous waters of global antitrust regulation.

Under the terms, Groq’s founder and CEO Jonathan Ross—a primary architect of the original Google (NASDAQ: GOOGL) TPU—and President Sunny Madra will join Nvidia’s executive ranks. Meanwhile, Groq will continue to operate as an independent entity under new CEO Simon Edwards to maintain its existing cloud service contracts and avoid direct competition with Nvidia’s primary data center customers.

The timeline leading up to this moment was characterized by a quiet but intense bidding war. Throughout late 2025, Groq had seen its valuation soar to nearly $7 billion as its LPU technology consistently outperformed Nvidia’s Blackwell architecture in raw inference speed for large language models (LLMs). Recognizing that the "memory wall" of traditional GPU architectures was becoming a bottleneck for real-time applications, Nvidia CEO Jensen Huang moved decisively to bring Groq’s deterministic Tensor Streaming Processor (TSP) architecture into the fold. The deal was reportedly fast-tracked in November after Groq demonstrated a 10x speed advantage in "prefill" latency for the latest Llama 4 models.
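The "memory wall" referenced above is a back-of-envelope argument: each autoregressive decode step streams roughly the entire set of model weights through memory, so a single stream's speed is capped by memory bandwidth rather than raw compute. The numbers in the sketch below (a 70B-parameter model at 8-bit weights and roughly 8 TB/s of bandwidth) are illustrative assumptions, not specifications of any Nvidia or Groq part.

```python
# Rough, illustrative arithmetic for the decode "memory wall".
# Generating one token requires reading approximately all model weights once,
# so a single stream's ceiling is memory bandwidth divided by the weight footprint.

params = 70e9                  # assumed 70B-parameter model
bytes_per_param = 1            # assumed 8-bit weights
weight_bytes = params * bytes_per_param

memory_bandwidth_bps = 8e12    # assumed ~8 TB/s on-package memory bandwidth

ceiling_tokens_per_s = memory_bandwidth_bps / weight_bytes  # ignores KV-cache traffic and compute
print(f"bandwidth-limited ceiling: ~{ceiling_tokens_per_s:.0f} tokens/s per stream")
```

Batching amortizes those weight reads across many requests, which is why GPUs shine at throughput; SRAM-heavy designs like Groq's aim instead to raise that per-stream ceiling.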

Market reaction has been overwhelmingly bullish, though tinged with awe at Nvidia's aggressive tactics. Analysts have dubbed the move the "Inference Play of the Decade," noting that it effectively closes the gap in Nvidia’s hardware stack. By merging the parallel throughput of GPUs with the sequential speed of LPUs, Nvidia is creating a heterogeneous computing platform that competitors will find nearly impossible to replicate. The timing, just days before the 2026 Consumer Electronics Show (CES), suggests that Nvidia is preparing to unveil a new category of "Inference-First" hardware that could redefine the personal computing and data center markets...
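In practice, a heterogeneous platform of that kind would most likely surface as a scheduling decision: interactive, small-batch requests routed to latency-optimized hardware, bulk jobs routed to throughput-optimized hardware. The sketch below is a purely hypothetical illustration of such a router; the pool names and the batch-size threshold are assumptions, not an announced Nvidia design.

```python
from dataclasses import dataclass

@dataclass
class Request:
    prompt: str
    interactive: bool        # e.g. a live chat turn vs. an offline batch job
    batch_size: int = 1

# Hypothetical backend pool names, for illustration only.
LOW_LATENCY_POOL = "lpu-style-low-latency-pool"
HIGH_THROUGHPUT_POOL = "gpu-style-batch-pool"

def route(req: Request) -> str:
    """Send interactive, small-batch traffic to the latency-optimized pool
    and bulk work to the throughput-optimized pool."""
    if req.interactive and req.batch_size <= 4:
        return LOW_LATENCY_POOL
    return HIGH_THROUGHPUT_POOL

print(route(Request("hi there", interactive=True)))                              # low-latency pool
print(route(Request("summarize the corpus", interactive=False, batch_size=256))) # batch pool
```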
