Musk Says xAI Used OpenAI's Technology in Ongoing Legal Battle

Elon Musk testified that his company xAI used OpenAI's technology through a technique called distillation to train its models, as the trial in his lawsuit alleging breach of charitable trust and unjust enrichment got underway

Martin Holloway · Published 7d ago · 5 min read · Based on 2 sources

Elon Musk testified in Oakland federal court that his company xAI partially trained its AI models by studying OpenAI's technology, as jury selection began in his lawsuit against OpenAI, the company he helped found as a nonprofit in 2015 and that has since shifted to a for-profit structure. The case centers on claims that OpenAI breached its charitable trust and unjustly enriched itself after moving away from its original mission.

What the Lawsuit Claims

Musk is suing OpenAI on two main grounds: breach of charitable trust and unjust enrichment. The breach claim targets OpenAI's original nonprofit structure. When Musk and co-founder Sam Altman started the company in 2015, it was supposed to be a nonprofit focused on developing safe, beneficial artificial intelligence for humanity. Musk's legal team argues that OpenAI abandoned that mission, especially after converting to a capped-profit entity and partnering with Microsoft.

The unjust enrichment claim goes further: it says OpenAI improperly benefited from what Musk and other early supporters contributed—money, ideas, and strategic input—during the nonprofit era.

How xAI Got the Technology: The Distillation Question

Musk's testimony included an important admission: xAI used a technique called distillation to learn from OpenAI's models. Distillation works like this: imagine you have a large, powerful AI trained on massive amounts of data. You can train a smaller, more efficient AI to mimic what the large one does. The smaller model captures much of the large model's knowledge but runs faster and costs less to operate. It is a standard tool in AI development.
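
For readers who want to see the mechanics, here is a minimal, hypothetical sketch of distillation for a small classifier, written in PyTorch. The model sizes, temperature, and loss weighting are illustrative assumptions, not a description of how xAI, OpenAI, or any other lab actually trains its systems.

```python
# Minimal knowledge-distillation sketch (toy models, illustrative only).
# A small "student" network learns to match the softened output
# distribution of a larger, frozen "teacher" network, alongside true labels.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(784, 1024), nn.ReLU(), nn.Linear(1024, 10))  # large model
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))      # small model

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 4.0       # temperature: softens the teacher's probabilities
alpha = 0.7   # weight on the distillation term vs. the hard-label term

def distillation_step(x, labels):
    with torch.no_grad():                      # teacher is frozen
        teacher_logits = teacher(x)
    student_logits = student(x)

    # KL divergence between softened student and teacher distributions
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)

    # Ordinary supervised loss on the true labels
    hard_loss = F.cross_entropy(student_logits, labels)
    loss = alpha * soft_loss + (1 - alpha) * hard_loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example: one training step on random stand-in data
x = torch.randn(32, 784)
labels = torch.randint(0, 10, (32,))
print(distillation_step(x, labels))
```

The key idea is the "soft" loss: the student is rewarded for matching the teacher's full probability distribution over answers, which carries far more information per example than the correct label alone.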

What makes this legally tricky is that xAI used distillation on OpenAI's technology. This raises a novel question: if one company's model learns from another company's model, especially when those companies share a history, who owns what? According to court documents, the specifics of how xAI trained its Grok model remain contested.

From a technical standpoint, distillation is perfectly normal in AI research. Companies use it all the time to make models more efficient. But applying it across competing companies—companies that started together but split—is legally untested territory.

Why Musk Left OpenAI in 2018

To understand the tension, you need to know what happened five years before xAI was born. Musk left OpenAI in 2018 after disagreeing with the company's direction. He wanted OpenAI to either merge with Tesla or become a for-profit company under his control. The leadership refused. So he stepped away.

That timing matters. The year Musk left, OpenAI was beginning to shift. It was moving from pure research into building practical AI products that could make money. The following year, OpenAI created a capped-profit subsidiary and partnered with Microsoft, bringing in billions in funding. Those moves transformed the company's mission and structure.

Looking at the broader pattern, disputes like this arise regularly when technology companies shift from a nonprofit or research focus to a commercial model. Early contributors often feel the mission has changed or that they should have benefited from the company's later success. Similar tensions have emerged in open-source software, early internet infrastructure, and other foundational technology domains. The difference here is that Musk did not simply leave quietly; he founded a competitor.

What This Case Could Mean for AI Companies

The immediate question is whether this case will set legal rules for how AI companies can use each other's technology, especially when founders and researchers have moved between them. But there is a broader question lurking underneath.

If courts decide that OpenAI's original nonprofit mission was a binding commitment, or that knowledge transferred through distillation cannot be used competitively, it could change how AI researchers and companies approach model development and knowledge sharing. Right now, the field assumes fairly open knowledge flow: people move between companies, techniques spread through academic papers and conferences, and competitors learn from each other's public work. A court could narrow that.

The breach of charitable trust claim is particularly significant for the AI industry. OpenAI is not the only research organization trying to straddle nonprofit and for-profit worlds. As more AI labs navigate that tension, this case could establish what obligations they truly owe to their original mission and early stakeholders.

The Competitive Stakes

xAI, launched by Musk in 2023, competes directly with OpenAI to build advanced language models and AI systems. Both companies want to reach artificial general intelligence—a hypothetical AI smarter than humans at most tasks—though they are taking different technical and business paths. The lawsuit is one front in a broader competition.

The distillation admission also points to something worth considering: in AI research, it is hard to keep technology truly secret. Knowledge leaks through published papers, engineer mobility, and techniques like the one Musk admitted to. Traditional ideas about competitive advantage and intellectual property do not always fit neatly.

What Comes Next

This case will likely become a test of how courts handle complex technical disputes in AI. The jury will need to understand distillation, model training, and how knowledge flows between AI systems. Jurors will also grapple with harder questions about nonprofit obligations and mission drift.

The outcome could influence how AI companies structure themselves, handle ownership of research, and manage transitions between nonprofit and for-profit forms. For an industry where much foundational work still happens in universities and nonprofits before moving to commercial use, the precedent could matter for years.